Tech

'Tickling' an atom to investigate the behavior of materials

image: The animation shows the vibration energy of the silicon atom in the graphene crystal

Image: 
D. Kepaptsoglou, SuperSTEM

Scientists and engineers working at the frontier of nanotechnology face huge challenges. When the position of a single atom in a material may change the fundamental properties of that material, scientists need something in their toolbox to measure how that atom will behave.

A research team led by the University of Leeds, in collaboration with colleagues at Sorbonne University in Paris, France, has shown for the first time that it is possible to develop a diagnostic technique loosely related to the idea of a tuning fork.

A tuning fork produces a fixed tone when energy is applied to it - in that case, when it is struck. But if the fork is somehow altered, it goes out of tune: the tone changes.

The technique being used by the research team involves firing a beam of electrons at a single atom in a solid. That energy stream causes it and the atoms that surround it to vibrate.

This creates a unique vibrational energy fingerprint, akin to the fixed tone from a tuning fork, which can be recorded by an electron microscope. But if a single atom impurity is present, another chemical element, for instance, the vibrational energy fingerprint of that impurity will change: the material will 'sound' different at this precise location.

The research opens up the possibility that scientists will be able to monitor materials for atomic impurities.

The findings, Single Atom Vibrational Spectroscopy in the Scanning Electron Microscope, are published today (March 5) in the journal Science.

Quentin Ramasse, Professor of Advanced Electron Microscopy at Leeds who led the project, said: "We now have direct evidence that a single "foreign" atom in a solid can change its vibrational property at the atomic scale.

"This has been predicted for decades, but there has not been any experimental technique to observe these vibrational changes directly. We have been able to show for the first time that you can record that defect signature with atomic precision."

The researchers used the SuperSTEM Laboratory, the UK National Research Facility for Advanced Electron Microscopy, supported by the Engineering and Physical Sciences Research Council (EPSRC).

The facility houses some of the most advanced instruments in the world for investigating the atomic structure of matter, and is operated under the auspices of an academic consortium led by the University of Leeds (also including the Universities of Oxford and York, which were involved in this project, as well as Manchester, Glasgow and Liverpool).

The scientists located a single impurity atom of silicon in a large graphene crystal (a form of carbon only one atom thick) - and then focused the beam of their electron microscope directly on that atom.

Professor Ramasse said: "We are hitting it with an electron beam, which makes the silicon atom move around or vibrate, absorbing some of the energy of the incoming beam of electrons in the process - and we are measuring the amount of energy that is being absorbed."

The animation illustrates schematically how the silicon vibrates, and how that vibration begins to affect neighbouring atoms, and is inspired by extensive theoretical calculations by the team of Dr Guillaume Radtke at the Sorbonne University, who collaborated on this project.

(The animation can be downloaded from: https://drive.google.com/open?id=1DIct9Wg6lqd6PdO2stLKK8orq1Idcezq)

"The vibrational response we observe is unique to how this particular silicon atom is located within the graphene lattice," added Dr Radtke. "We could predict how its presence would perturb the surrounding network of carbon atoms, but these experiments represent a real technical achievement because we are now able to measure with atomic precision such a subtle change."

Credit: 
University of Leeds

Cool beans: A vertical crop fit for Africa's changing climate and nutritional gaps

image: Beans in a market in Rwanda.

Image: 
Neil Palmer - International Center for Tropical Agriculture

Growing more climbing beans, as opposed to lower-yield bush beans, could help increase food security in sub-Saharan Africa as demand for food increases, climate change becomes more pronounced, and arable land becomes scarcer, according to a new study. Researchers mapped suitable cultivation areas and modeled future scenarios for 14 countries. The results indicate where specialists can target efforts to promote climbing bean cultivation in areas that are highly suitable for the crop but not yet planted with it.

"Climate change is making it more difficult for Africa to produce food," said Glenn Hyman, a co-author and environmental scientist at Spatial Informatics Group. "Yields are expected to go down. We're proposing climbing beans as an intensification solution, mostly because they yield three times more than bush beans."

Varieties of the common bean, Phaseolus vulgaris, are essential for nutrition and income for millions in sub-Saharan Africa. Sustaining the growing export trade while satisfying domestic demand will require a substantial increase in yield from existing cropland. But expansion to new lands is no longer feasible in most countries.

The research was published in January in Mitigation and Adaptation Strategies for Global Change. Co-authors included scientists from the Alliance of Bioversity International and the International Center for Tropical Agriculture (CIAT), and Colombia's Universidad del Valle.

Though experts predict that higher temperatures and less rainfall will make many areas inhospitable even for climbing beans, they argue there are still places that will become more suitable for the variety. These regions may produce little of it at the moment, but their future climate and soil conditions offer great opportunities to boost their yields.

To identify these areas, researchers used models to project the future geographic distribution of beans and overlapped them with their present ranges. "We compared the current distribution of climbing beans with their suitability for the land and climate. There are some places with good conditions to grow the bean, but with no current production," said Hyman.
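
The overlay logic behind that comparison can be sketched in a few lines of code. The snippet below is a hedged illustration, not the study's actual workflow: the grids, thresholds and values are invented purely to show how highly suitable but currently unplanted areas could be flagged.

```python
import numpy as np

# Hypothetical gridded data (values are assumptions for illustration):
# suitability: modeled climate/soil suitability for climbing beans, 0-1
# production:  current climbing-bean production, tonnes per cell
suitability = np.array([[0.90, 0.20, 0.70],
                        [0.80, 0.60, 0.10],
                        [0.30, 0.95, 0.50]])
production = np.array([[120.0, 0.0, 5.0],
                       [  0.0, 80.0, 0.0],
                       [ 10.0, 0.0, 0.0]])

SUITABLE = 0.7   # assumed threshold for "highly suitable"
UNTAPPED = 1.0   # assumed threshold for "little or no current production"

# Cells that are highly suitable but essentially unplanted are candidate
# targets for promoting climbing-bean cultivation.
target_cells = (suitability >= SUITABLE) & (production < UNTAPPED)
print(np.argwhere(target_cells))  # row/column indices of candidate areas
```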

Models suggest climbing beans can now find suitable hotspots in the African Great Lakes region, and parts of Ethiopia, Cameroon, and Zimbabwe, while Rwanda will become increasingly fit for the crop. But in the future, over half of the countries in the study will become less suitable, with major changes across southern Africa, in Zambia, Zimbabwe, Mozambique, Malawi, and the southern part of Tanzania.

Climbing beans can adapt to Africa's future climates in several ways, say the authors. One would be farmers intensifying cultivation in high-producing areas that will remain fit under future climate change. Another would be to start expanding crops to areas where rainfall, temperatures and soils will meet the crop's needs in the future.

Experts warn that accurately predicting bean distribution requires data from more locations. At the same time, the study doesn't account for the potential effect of higher carbon dioxide emissions on increasing photosynthesis, and thus bean yields, known as the CO2 fertilization effect. Factoring in these variables could provide more detailed maps for targeting adaptation for climbing beans.

The stalks are high

By 2030, higher temperatures and more frequent droughts might cut common bean production by 3-5 percent across Africa, undermining the continent's capacity to meet its needs for the legume. Beans lie at the core of East Africans' diet, providing important balancing nutrients, such as proteins and minerals. As booming populations and urbanization drive up bean demand, sub-Saharan Africa needs to boost its bean yields, yet experts say that most countries cannot further expand their crops.

Climbing beans might help not only Africa, but also other regions growing beans in similar conditions - such as Central America and the Andes, especially with the incorporation of heat- and drought-tolerant varieties. This should happen in tandem with improving common bean varieties to cope with climate change and intensifying production where the land allows, as part of bigger adaptation efforts.

Credit: 
The Alliance of Bioversity International and the International Center for Tropical Agriculture

Scientists create model to predict multipathogen epidemics

image: Patrick Clay is a postdoctoral researcher at the University of Michigan.

Image: 
University of Michigan

HOUSTON -- (March 5, 2020) -- Diseases often pile on, coinfecting people, animals and other organisms that are already fighting an infection. In one of the first studies of its kind, bioscientists from Rice University and the University of Michigan have shown that interactions between pathogens in individual hosts can predict the severity of multipathogen epidemics.

In lab experiments, scientists explored how the timing of bacterial and fungal infections in individual zooplankton impacted the severity of bacterial and fungal epidemics in zooplankton populations. The study, published this week in the Proceedings of the Royal Society B, showed that the order of infections in individual hosts can change the course of an epidemic.

"It's well known that the way parasites and pathogens interact within hosts can alter disease transmission, but the question has been, 'What information do you need to gather about those interactions to be able to predict the severity of an epidemic?'" said corresponding author Patrick Clay, a University of Michigan postdoctoral associate who conducted the research during his Ph.D. studies at Rice.

"What we showed is that you need to understand how infection order alters within-host interactions to be able to predict the severity of epidemics," he said. "We particularly need this information to predict how changes in the timing of an epidemic relative to co-occurring epidemics alters epidemic severity."

The research does not apply to the coronavirus.

"This applies to situations where multiple epidemics are simultaneously occurring and where the co-occurring pathogens interact within hosts," said study co-author Volker Rudolf, Clay's Ph.D. adviser at Rice. "There is no data to suggest that this is the case for COVID-2019."

But coinfections are common in humans and wildlife populations, and because they are difficult to study, much is still unknown about how they alter epidemic dynamics, Rudolf said.

"Disease biology and epidemiology have historically focused on one-on-one interactions: one pathogen, one host," said Rudolf, a professor of biosciences at Rice. "However, scientists increasingly recognize that diseases don't exist in a vacuum. In reality, a diverse community of parasites and pathogens are out there, and they interact with each other. This study emphasizes a more holistic, almost community type of approach to studying infectious diseases."

The study combined experiments with epidemiology models and computer simulations.

The zooplankton species used in the experiments, Daphnia dentifera, is a small crustacean that's both abundant and ecologically important in lakes across the U.S. Midwest. Zooplankton are also transparent, and Clay used a microscope to detect and monitor the growth of fungal and bacterial spores inside the animals. By altering the order of infection in test populations and examining thousands of individuals, he was able to document crucial differences in the way the pathogens interacted inside hosts.

"The order in which pathogens infect the host determines how they interact in the host, and this interaction ultimately determines how epidemics play out in the entire host population," he said.

Clay created epidemiological models that predicted the dynamics of fungal and bacterial epidemics in zooplankton populations that had one or both infections, when either the fungal or bacterial epidemic came before the other. He also spent months monitoring how epidemics progressed over many generations of infected and coinfected zooplankton populations.

By comparing data from experimental epidemics and computer simulations, Clay and Rudolf were able to determine what information was crucial to reliably predict how diseases would spread in coinfected populations.

"When we used models that ignored interactions between pathogens within coinfected hosts completely, the predictions were terrible," Rudolf said. "The models improved somewhat once we accounted for interactions between coinfecting pathogens, but it was still completely off on what proportion of individuals got infected. Only when we also specifically accounted for how the order of pathogen infections altered interactions within a host were we able to accurately predict epidemics, including how long an epidemic lasts and the proportion of hosts the pathogens infected."

Clay said the work is important because climate change will alter the timing of seasonal epidemics in many species, and understanding the impact of timing on epidemics will be crucial for ecologists. And his coinfection model could be useful for studying the severity of overlapping epidemics in other species.

"The model does not have anything in it that is specific to zooplankton," he said. "It uses parameters like the rate at which hosts are dying, the rate at which new hosts are being born, the rate at which the disease is transmitted, and you can measure those for any disease."

But the crucial metrics from his experiments, the data describing how order of infection altered within-host interactions between pathogens, are often either unknown or uncollected. And Clay hopes that will change.

"I hope this will influence people to look at what happens when you alter the order of infection in coinfected hosts," he said. "Because we've shown that that information is vital for predicting epidemics, and if people start gathering that information now they will be more prepared to predict the severity of future epidemics."

Credit: 
Rice University

Research shows microplastics are damaging to coral ecosystems

Microplastics are a growing environmental concern, and the effects of this waste product on coral are highlighted in research published in Chemosphere from an international team of researchers including UConn marine science professor Senjie Lin.

Plastic discarded into the environment breaks down into smaller and smaller fragments, called microplastics once they measure less than five millimeters. Microplastics are widespread throughout the environment and are ingested by animals at all levels of the food web, starting from the smallest organisms all the way to apex predators, including humans.

Plastics contain hazardous compounds such as bisphenol A (BPA), flame retardants, and other known carcinogens or endocrine disruptors. Plastics can also easily absorb toxins from the environment, such as trace metals and organic pollutants like PCBs.

In marine environments, very small organisms such as protists and phytoplankton are also subject to the detrimental impacts of microplastics. That presents a significant problem for corals, which rely on symbiotic relationships between different organisms, says Lin.

"Coral ecosystems are very collaborative," he says. "Corals are invertebrates who rely on algae who live inside the corals and photosynthesize energy-rich and nutritional compounds for the corals. The algae in turn receive nutrients from the corals' metabolic wastes. It is a very mutualistic system."

Beyond the collaboration between coral and endosymbionts, corals provide habitat for a stunning array of marine life, says Lin.

"They are the most biodiverse ecosystems in the ocean," he says. "They are an extremely valuable biological resource."

Unfortunately, these ecosystems face large and growing threats, including global warming, pollution, and physical destruction from human activities.

Lin and his fellow researchers wanted to explore the effects of microplastics on a common tropical coral reef dweller. The researchers looked at specific endosymbionts called Symbiodiniaceae because they are the most prevalent photosynthetic symbionts in coral ecosystems in tropical and subtropical waters. The species of Symbiodiniaceae they focused on is called Cladocopium goreaui.

The team started by rearing the algal cells and dividing them into groups. Then some of the algae were exposed to microplastics.

After about a week, the group exposed to microplastics experienced significant reduction in population size as well as cell size, even though chlorophyll content increased slightly relative to the control group, the latter of which might be due to the shading effect of the microplastics.

The team also measured the activities of enzymes related to stress response and detoxification within the cells.

The researchers noticed elevated activity of an enzyme called superoxide dismutase (SOD) and a significant decrease in glutathione S-transferase (GST). The team also found that a key enzyme in signaling cell death, or apoptosis, was elevated. Together, these changes raise cell stress levels and depress the cells' ability to detoxify themselves, culminating in declines in the health of the algal cells exposed to microplastics.

The researchers also looked at differences in gene regulation between the groups. They found 191 genes that were differentially expressed, including genes related to immune function, photosynthesis, and metabolism. The gene regulation data showed that microplastics can act as stressors, impact nutrient uptake, suppress cell detoxification activities, impact photosynthesis, and increase the chances a cell will self-destruct.

"The emerging pollutant of microplastics are found to effect coral health and have a direct effect on the endosymbiont after they are exposed to microplastics," says Lin.

This is particularly troubling, as Lin notes that world-wide, coral reefs have already seen a nearly 50% decline.

"This is a major loss and it is predicted that we will continue to lose 90% by 2050 if we do not do anything to slow the loss," he says. "This is a grave issue and we need to act fast."

Going forward, Lin says he plans to research the effects of microplastics on phytoplankton, the primary producers in the ocean, as well as continuing to research how corals are impacted.

"Phytoplankton are at the base of the marine food chain, if they are effected there is potential that the entire food chain and whole marine ecosystem will experience the impact," he says.

Lin says that since microplastics persist for so long in the environment, the best thing individuals can do right now is minimize the use of plastic in our daily lives. Microplastics are not a problem that will go away any time soon, but Lin is confident that minimizing the use of plastic will have a direct impact on better preserving the environment.

"I am quite optimistic, though the current situation is pretty dire," he says. "Through the course of history, corals have gone through dire climate and environmental changes like today, even worse in some cases. The only thing right now is the environmental changes are happening faster than historical or natural processes, we are not giving the corals enough time to adapt. If we take action now to slow or stop interference to the environment, there is hope."

Credit: 
University of Connecticut

Corn productivity in real time: Satellites, field cameras, and farmers team up

image: University of Illinois doctoral student Hyungsuk Kimm set up a network of cameras in corn fields around Illinois to ground-truth satellite-based algorithms to monitor corn productivity in real time.

Image: 
Image courtesy of Hyungsuk Kimm, University of Illinois.

URBANA, Ill. - University of Illinois scientists, with help from members of the Illinois Corn Growers Association, have developed a new, scalable method for estimating crop productivity in real time. The research, published in Remote Sensing of Environment, combines field measurements, a unique in-field camera network, and high-resolution, high-frequency satellite data, providing highly accurate productivity estimates for crops across Illinois and beyond.

"Our ultimate goal is to provide useful information to farmers, especially at the field level or sub-field level. Previously, most available satellite data had coarse spatial and/or temporal resolution, but here we take advantage of new satellite products to estimate leaf area index (LAI), a proxy for crop productivity and grain yield. And we know the satellite estimates are accurate because our ground measurements agree," says Hyungsuk Kimm, a doctoral student in the Department of Natural Resources and Environmental Sciences (NRES) at U of I and lead author on the study.

Kimm and his colleagues used surface reflectance data, which measures light bouncing off the Earth, from two kinds of satellites to estimate LAI in agricultural fields. Both satellite datasets represent major improvements over older satellite technologies; they can "see" the Earth at a fine scale (3-meter or 30-meter resolution) and both return to the same spot above the planet on a daily basis. Since the satellites don't capture LAI directly, the research team developed two mathematical algorithms to convert surface reflectance into LAI.
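
The study's two conversion algorithms are not reproduced here, but a common empirical route from reflectance to LAI goes through a vegetation index. The sketch below is a hedged illustration of that general approach, inverting a simple light-extinction relationship from NDVI; the coefficients are assumed values, not the ones used in the paper.

```python
import numpy as np

def lai_from_reflectance(red, nir, ndvi_inf=0.94, ndvi_soil=0.05, k=0.6):
    """Estimate leaf area index from red/NIR surface reflectance.

    Illustrative empirical approach (not the algorithms from the study):
    compute NDVI, then invert a Beer-Lambert-style relationship
    NDVI = NDVI_inf - (NDVI_inf - NDVI_soil) * exp(-k * LAI).
    All coefficients here are assumed values for the sketch.
    """
    ndvi = (nir - red) / (nir + red)
    ndvi = np.clip(ndvi, ndvi_soil + 1e-6, ndvi_inf - 1e-6)
    return -np.log((ndvi_inf - ndvi) / (ndvi_inf - ndvi_soil)) / k

# Example: surface reflectance for a few pixels of a corn field
red = np.array([0.05, 0.08, 0.12])
nir = np.array([0.45, 0.40, 0.30])
print(lai_from_reflectance(red, nir))
```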

While developing the algorithms to estimate LAI, Kimm worked with Illinois farmers to set up cameras in 36 corn fields across the state, providing continuous ground-level monitoring. The images from the cameras provided detailed ground information to refine the satellite-derived estimates of LAI.

The true test of the satellite estimates came from LAI data Kimm measured directly in the corn fields. Twice weekly during the 2017 growing season, he visited the fields with a specialized instrument and measured corn leaf area by hand.

In the end, the satellite LAI estimates from the two algorithms strongly agreed with Kimm's "ground-truth" data from the fields. This result means the algorithms delivered highly accurate, reliable LAI information from space, and can be used to estimate LAI in fields anywhere in the world in real time.

"We are the first to develop scalable, high-temporal, high-resolution LAI data for farmers to use. These methods have been fully validated using an unprecedented camera network for farmland," says Kaiyu Guan, assistant professor in the Department of NRES and Blue Waters professor at the National Center for Supercomputing Applications. He is also principal investigator on the study.

Having real-time LAI data could be instrumental for responsive management. For example, the satellite method could detect underperforming fields or segments of fields that could be corrected with targeted management practices such as nutrient management, pesticide application, or other strategies. Guan plans to make real-time data available to farmers in the near future.

"The new LAI technology developed by Dr. Guan's research team is an exciting advancement with potential to help farmers identify and respond to in-field problems faster and more effectively than ever before," says Laura Gentry, director of water quality research for the Illinois Corn Growers Association.

"More accurate measurements of LAI can help us to be more efficient, timely, and make decisions that will ultimately make us more profitable. The last few years have been especially difficult for farmers. We need technologies that help us allocate our limited time, money, and labor most wisely. Illinois Corn Growers Association is glad to partner with Dr. Guan's team, and our farmer members were happy to assist the researchers with access to their crops in validating the team's work. We're proud of the advancement this new technology represents and are excited to see how the Guan research team will use it to bring value directly to Illinois farmers," Gentry adds.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Many Lyme disease cases go unreported; a new model could help change that

image: A data model developed by Columbia researchers showed about 162 U.S. counties may have Lyme disease cases that have not yet been reported to the CDC.

Image: 
US Centers for Disease Control and Prevention

The Centers for Disease Control and Prevention receives reports of about 30,000 cases of Lyme disease each year. The real number, according to the agency, is closer to 300,000.

Underreporting affects the ability of public health authorities to assess risk, allocate resources and devise prevention strategies. It also makes early detection very difficult, hampering efforts to treat the condition quickly and effectively.

A new report, published on March 3, 2020, in the Journal of the American Medical Association, describes a data model developed by researchers from Columbia University and RTI International, a nonprofit research institute, that helps identify areas of the United States where Lyme disease cases may go unreported.

"We believe our analysis can help predict the trajectory of where Lyme disease will spread," said Maria Pilar Fernandez, a post-doctoral researcher at Columbia and lead author of the study. "Identifying high-risk areas can lead to surveillance in counties and areas where infections are likely to emerge. It also allows authorities to alert physicians and the public, which can lead to early treatment, when it is most effective."

To develop their model, the researchers analyzed publicly available data, tracking the geographic spread of Lyme disease over nearly two decades. They studied an estimated 500,000 cases of the illness reported to CDC from different counties across the United States between 2000 and 2017.
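
The paper's statistical details are not spelled out here, but the general idea of flagging counties whose conditions resemble those of counties with established transmission can be sketched as a simple classification exercise. The snippet below is an illustration only, with invented covariates and data, and is not the Columbia/RTI model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows are counties; columns are hypothetical covariates such as
# forest cover fraction, mean annual temperature (C), and a tick habitat index.
X_reporting = np.array([[0.70,  9.5, 0.80],   # counties that already report cases
                        [0.60, 10.1, 0.70],
                        [0.20, 14.0, 0.10],
                        [0.10, 15.2, 0.20]])
y_reporting = np.array([1, 1, 0, 0])          # 1 = sustained Lyme transmission

model = LogisticRegression().fit(X_reporting, y_reporting)

# Counties with no reported cases: flag those whose covariates resemble
# counties with established transmission (possible under-reporting or emergence).
X_silent = np.array([[0.65,  9.8, 0.75],
                     [0.15, 14.8, 0.15]])
risk = model.predict_proba(X_silent)[:, 1]
print(risk, np.where(risk > 0.5)[0])
```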

Lyme disease is difficult to diagnose, and accurate case assessment depends on many variables, the researchers said, from provider awareness and testing methods to reporting practices, state budgets and personnel.

"We were able to show that about 162 U.S. counties may already have Lyme disease, but they have not yet been reported to the CDC," said Maria Diuk-Wasser, associate professor in the Department of Ecology, Evolution and Environmental Biology at Columbia and a co-author on the study.

The CDC collects Lyme disease data from state and local health departments, which base the number of cases on notifications from clinicians, hospitals and laboratories.

Although Lyme disease has been diagnosed in almost every state, most cases reported to the CDC are in the Northeast and upper Midwest.

If diagnosed early--a rash commonly appears around the site of the tick bite--the condition can be effectively treated with antibiotics. Longer term infections can produce more serious symptoms, including joint stiffness, brain inflammation and nerve pain.

Models have been created in the past to identify high-risk areas in a few states or regions in the United States, but the new one expands the geographic scope to all areas in the U.S. where the disease is most likely to occur.

"In the future, the model can be expanded," Fernandez said. "We hope to continue to keep track of the spread and inform authorities about areas where Lyme disease is likely to emerge."

Credit: 
Columbia University

Layered solar cell technology boosts efficiency, affordability

image: Perovskite/silicon tandem solar cells are contenders for the next-generation photovoltaic technology, with the potential to deliver module efficiency gains at minimal cost.

Image: 
Dennis Schroeder / NREL

The future's getting brighter for solar power. Researchers from CU Boulder have created a low-cost solar cell with one of the highest power-conversion efficiencies to date, by layering cells and using a unique combination of elements.

"We took a product that is responsible for a $30 billion a year industry and made it 30% better," said Michael McGehee, a professor in the Department of Chemical and Biological Engineering and co-author of a paper, to be published tomorrow in Science, which describes the technology. "That's a big deal."

The researchers took a perovskite solar cell, a crystal structure designed to harvest higher-energy photons, and layered it on top of a silicon solar cell, which captures more photons in the infrared part of the spectrum - radiant energy that we cannot see but can feel as heat. Combined, the perovskite layer raises a 21% silicon solar cell to an efficiency of 27% - an increase of roughly a third.
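
The quoted gain is straightforward to check: moving from 21% to 27% absolute efficiency is a relative improvement of about 29%, which is where the "roughly a third" and "30% better" figures come from.

```python
base, tandem = 0.21, 0.27
print((tandem - base) / base)  # ~0.286, i.e. roughly 30% better than silicon alone
```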

For years, silicon solar cells have been the standard in the solar power industry. But current silicon-based cells only convert 18% to 21% of the sun's energy into usable electricity on average, and they max out at about 26.6%.

This means it now costs more to install the cells than to buy them, said McGehee, a fellow in the Renewable & Sustainable Energy Institute.

The average efficiency of solar panels is lower than the maximum efficiency, because no matter how good an individual, small solar cell is, it will lose about three percentage points when applied over a large panel - kind of like a sports team only being as good as its average player. But if you can raise the overall efficiency, you don't have to install as many panels to get the same amount of power.

What dramatically improves efficiency is to put another solar cell on top of an existing one - and that's exactly what McGehee and his fellow researchers did.

An affordable secret formula

This isn't the first time researchers have layered solar cells to gain efficiency. The concept, also known as tandem or multi-junction solar cells, was first introduced in the 1970s - and the world record for solar cell efficiency is already over 45%. However, that record came at a hefty price: $80,000 per square meter, because the cells were grown one atomic layer at a time, creating one big, single crystal. Probably not a cost the average homeowner or business can afford.

McGehee and his fellow researchers are the pioneers in a new direction of layered solar cells, using perovskites, which cost less than a hundredth as much.

They started fewer than 10 years ago with the concept of using less expensive materials on top of the silicon, and at first only achieved about 13% efficiency. But through technological improvements they have been able to more than double that number.

Their secret formula involves a unique triple-halide alloy of chlorine, bromine, and iodine.

In solar cells, there is an ideal bandgap, according to McGehee. This is the gap between energy levels in a semiconductor across which electrons jump, creating electrical energy.

Bromine can raise this bandgap, but when used with iodine and exposed to light, these elements don't always stay in place. Previous studies have tried to use chlorine and iodine together, but due to the differing particle sizes of these elements, not enough chlorine could fit into the perovskite crystal structure. But by using different amounts of chlorine, bromine, and iodine, the researchers figured out a way to shrink the crystal structure, allowing more chlorine to fit in - stabilizing and improving the cell's efficiency.

Perovskites are also inexpensive, not energy intensive to make and easy to create in the lab. And even after 1000 hours - or almost 42 days - of intensive light and heat testing, these new solar cells showed a minimal change in their initial efficiency.

With the solar power market growing around 30% per year, efficiency, cost and longevity are major considerations for which new technologies will become mainstream.

McGehee is optimistic about the potential of this wide-bandgap, layered perovskite solar cell.

Not only has it now surpassed the maximum efficiency of a silicon-only solar cell, "we believe it can take us over 30% efficiency and that it can be stable," said McGehee.

Credit: 
University of Colorado at Boulder

New approach to sustainable building takes shape in Boston

A new building about to take shape in Boston's Roxbury area could, its designers hope, herald a new way of building residential structures in cities.

Designed by architects from MIT and the design and construction firm Placetailor, the five-story building's structure will be made from cross-laminated timber (CLT), which eliminates most of the greenhouse-gas emissions associated with standard building materials. It will be assembled on site mostly from factory-built subunits, and it will be so energy-efficient that its net carbon emissions will be essentially zero.

Most attempts to quantify a building's greenhouse gas contributions focus on the building's operations, especially its heating and cooling systems. But the materials used in a building's construction, especially steel and concrete, are also major sources of carbon emissions and need to be included in any realistic comparison of different types of construction.

Wood construction has tended to be limited to single-family houses or smaller apartment buildings with just a few units, narrowing the impact that it can have in urban areas. But recent developments -- involving the production of large-scale wood components, known as mass timber; the use of techniques such as cross-laminated timber; and changes in U.S. building codes -- now make it possible to extend wood's reach into much larger buildings, potentially up to 18 stories high.

Several recent buildings in Europe have been pushing these limits, and now a few larger wooden buildings are beginning to take shape in the U.S. as well. The new project in Boston will be one of the largest such residential buildings in the U.S. to date, as well as one of the most innovative, thanks to its construction methods.

Described as a Passive House Demonstration Project, the Boston building will consist of 14 residential units of various sizes, along with a ground-floor co-working space for the community. The building was designed by Generate Architecture and Technologies, a startup company out of MIT and Harvard University, headed by John Klein, in partnership with Placetailor, a design, development, and construction company that has specialized in building net-zero-energy and carbon-neutral buildings for more than a decade in the Boston area.

Klein, who has been a principal investigator in MIT's Department of Architecture and now serves as CEO of Generate, says that large buildings made from mass timber and assembled using the kit-of-parts approach he and his colleagues have been developing have a number of potential advantages over conventionally built structures of similar dimensions. For starters, even when factoring in the energy used in felling, transporting, assembling, and finishing the structural lumber pieces, the total carbon emissions produced would be less than half that of a comparable building made with conventional steel or concrete. Klein, along with collaborators from engineering firm BuroHappold Engineering and ecological market development firm Olifant, will be presenting a detailed analysis of these lifecycle emissions comparisons later this year at the annual Passive and Low Energy Architecture (PLEA) conference in A Coruña, Spain, whose theme this year is "planning post-carbon cities."

For that study, Klein and his co-authors modeled nine different versions of an eight-story mass-timber building, along with one steel and one concrete version of the building, all with the same overall scale and specifications. Their analysis showed that materials for the steel-based building produced the most greenhouse emissions; the concrete version produced 8 percent less than that; and one version of the mass-timber building produced 53 percent less.

The first question people tend to ask about the idea of building tall structures out of wood is: What about fire? But Klein says this question has been thoroughly studied, and tests have shown that, in fact, a mass-timber building retains its structural strength longer than a comparable steel-framed building. That's because the large timber elements, typically a foot thick or more, are made by gluing together several layers of conventional dimensioned lumber. These will char on the outside when exposed to fire, but the charred layer actually provides good insulation and protects the wood for an extended period. Steel buildings, by contrast, can collapse suddenly when the temperature of the fire approaches steel's melting point and causes it to soften.

The kit-based approach that Generate and Placetailor have developed, which the team calls Model-C, means that in designing a new building, it's possible to use a series of preconfigured modules, assembled in different ways, to create a wide variety of structures of different sizes and for different uses, much like assembling a toy structure out of LEGO blocks. These subunits can be built in factories in a standardized process and then trucked to the site and bolted together. This process can reduce the impact of weather by keeping much of the fabrication process indoors in a controlled environment, while minimizing the construction time on site and thus reducing the construction's impact on the neighborhood.

"It's a way to rapidly deploy these kinds of projects through a standardized system," Klein says. "It's a way to build rapidly in cities, using an aesthetic that embraces offsite industrial construction."

Because the thick wood structural elements are naturally very good insulators, the Roxbury building's energy needs for heating and cooling are reduced compared to conventional construction, Klein says. They also provide very good acoustic insulation for the building's occupants. In addition, the building is designed to have solar panels on its roof, which will help to offset the building's energy use.

The team won a wood innovation grant in 2018 from the U.S. Forest Service, to develop a mass-timber based system for midscale housing developments. The new Boston building will be the first demonstration project for the system they developed.

"It's really a system, not a one-off prototype," Klein says. With the on-site assembly of factory-built modules, which includes fully assembled bathrooms with the plumbing in place, he says the basic structure of the building can be completed in only about one week per floor.

"We're all aware of the need for an immediate transition to a zero-carbon economy, and the building sector is a prime target," says Andres Bernal SM '13, Placetailor's director of architecture. "As a company that has delivered only zero-carbon buildings for over a decade, we're very excited to be working with CLT/mass timber as an option for scaling up our approach and sharing the kit-of-parts and lessons learned with the rest of the Boston community."

With U.S. building codes now allowing for mass timber buildings of up to 18 stories, Klein hopes that this building will mark the beginning of a new boom in wood-based or hybrid construction, which he says could help to provide a market for large-scale sustainable forestry, as well as for sustainable, net-zero energy housing.

"We see it as very competitive with concrete and steel for buildings of between eight and 12 stories," he says. Such buildings, he adds, are likely to have great appeal, especially to younger generations, because "sustainability is very important to them. This provides solutions for developers, that have a real market differentiation."

He adds that Boston has set a goal of building thousands of new units of housing, and also a goal of making the city carbon-neutral. "Here's a solution that does both," he says.

Credit: 
Massachusetts Institute of Technology

Downsizing the McMansion: Study gauges a sustainable size for future homes

What might homes of the future look like if countries were really committed to meeting global calls for sustainability, such as the recommendations advanced by the Paris Agreement and the U.N.'s 2030 Agenda for Sustainable Development?

Much wider adoption of smart design features and renewable energy for low- to zero-carbon homes is one place to start -- the U.N. estimates households consume 29% of global energy and consequently contribute to 21% of resultant CO2 emissions, which will only rise as global population increases.

However, a new scholarly paper authored at New Jersey Institute of Technology (NJIT) assesses another big factor in the needed transformation of our living spaces toward sustainability -- the size of our homes.

The paper published in the journal Housing, Theory & Society makes the case for transitioning away from the large, single-family homes that typify suburban sprawl, offering new conceptions for what constitutes a more sustainable and sufficient average home size in high-income countries going forward.

The article surveys more than 75 years of housing history and provides estimates for the optimal spatial dimensions that would align with an "environmentally tenable and globally equitable amount of per-person living area" today. It also spotlights five emerging cases of housing innovation around the world that could serve as models for effectively adopting more space-efficient homes of the future.

"There is no question that if we are serious about embracing our expressed commitments to sustainability, we will in the future need to live more densely and wisely," said Maurie Cohen, the paper's author and professor at NJIT's Department of Humanities. "This will require a complete reversal in our understanding of what it means to enjoy a 'good life' and we will need to start with the centerpiece of the 'American Dream,' namely the location and scale of our homes.

"The notion of 'bigger is better' will need to be supplanted by the question of 'how much is enough?' Fortunately, we are beginning to see examples of this process unfolding in some countries around the world, including the United States."

Reimagining "Sufficient" Size of Sustainable Homes

Cohen's article explores the concept of "sufficiency limits" for the average contemporary home -- or, a rough baseline metric of "enough" living space to meet one's individual needs while considering various environmental and social factors, such as global resource availability and equitable material usage.

In the paper, Cohen reports that standardized building codes used in the United States and many other countries define minimally "sufficient" home size as 150 square feet for a single individual and 450 square feet for a four-person household.

However, from the standpoint of resource utilization and global equity, the maximally sufficient threshold is more significant.

Based on assessments of global resource availability and so-called total material consumption calculations developed by industrial ecologists and others, Cohen estimates that sustainability and equity considerations require that a home for a single person should be no larger than 215 square feet, and for a four-person family the maximum size should be approximately 860 square feet.

As a striking point of comparison, average home size in the U.S. today is 1,901 square feet -- more than twice what could be considered sustainable.

Applying these sufficiency limits in the real world would mean a radical departure from the mindset that is common today in the American homebuilding industry -- large cathedral-ceiling foyers, expansive porches, spare bedrooms and extra dining rooms -- and a fundamental rethink of the McMansion-style homes that line the cul-de-sacs of the country's suburbs. However, it could spur innovation in the design of more space-efficient homes, a trend gaining popularity particularly among younger generations.

"Lifestyle magazines and websites, television programs, and other media today regularly highlight the benefits of smaller homes," said Cohen. "One of the most popular contemporary design trends focuses on minimalism and especially Millennials express a desire to live in cosmopolitan urban centers rather than car-dependent suburbs. In some cities, micro-luxury apartments are becoming a fashionable alternative."

Along with making the critical transition toward greener technologies, Cohen says exploring sufficiency limits in the design of future homes would help to begin aligning infrastructure planning with global sustainability targets, and address two interrelated -- and in many ways perplexing -- trends that have been unfolding in wealthy countries like the U.S. since the 1950s: home size has been increasing while household size has been declining.

Over the past seven decades, the average size of a newly built single-family home in the country nearly tripled from 983 square feet in 1950 to 2,740 square feet in 2015. Meanwhile, the average number of people per household has decreased 24% (3.3 persons to 2.52 persons) due to falling fertility rates and the fading of residential arrangements in which extended families lived under a single roof.

So, what would the average newly built U.S. home look like if architects and the building industry followed the numbers and adopted sufficiency limits?

In the U.S., average floor space per person would need to be reduced from 754 square feet to 215 square feet, which, perhaps surprisingly, is roughly comparable to the amount of space available during the baby boom of the 1950s.
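
The arithmetic behind those figures is simple to reproduce from the numbers quoted in the article:

```python
AVG_HOME_SQFT = 1901      # average U.S. home size today (from the article)
AVG_HOUSEHOLD = 2.52      # average persons per household (from the article)
MAX_PER_PERSON = 215      # Cohen's estimated sufficiency ceiling, sq ft per person

per_person = AVG_HOME_SQFT / AVG_HOUSEHOLD
print(round(per_person))                      # ~754 sq ft per person today
print(round(per_person / MAX_PER_PERSON, 1))  # ~3.5 times the sufficiency ceiling
```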

While Cohen acknowledges the myriad political, commercial and cultural challenges of imparting such a sufficiency ceiling on current housing practices, he highlights five examples that he asserts point to shifting sensibilities: the tiny-house movement in the United States; the niche market for substantially smaller houses and apartments in the Nordic countries; the construction of accessory dwelling units in west coast cities of North America; the growing popularity of micro-apartments in New York City and San Francisco; and the emergence of co-living/co-working facilities in Europe.

"Downsizing at such a radical scale may seem unrealistic today, but lifestyles are continually in flux and when looking back on our recent practices of spending such vast sums of money on overly large houses and creating vast separations between neighbors, thirty years from now we will in all likelihood be utterly dumbfounded," said Cohen. "The idea of spending endless hours mindlessly driving around in cars to reach houses with rooms that we rarely use, we can only hope, will become a faint memory."

Credit: 
New Jersey Institute of Technology

'Magnonic nanoantennas': optically-inspired computing with spin waves one step closer

Milan, 5 March 2020 - An article published in the journal Advanced Materials, and featured on the front cover of the March 5th issue, demonstrates a new methodology for generating and manipulating spin waves in nanostructured magnetic materials. This work opens the way to developing nano-processors for extraordinarily fast and energy-efficient analog processing of information.

The discovery was the result of a collaboration among the magnetism group in the Physics Department at Politecnico di Milano, comprising Edoardo Albisetti, Daniela Petti and Riccardo Bertacco, the Elisa Riedo group (New York University Tandon School of Engineering), Silvia Tacchi of the Istituto Officina dei Materiali of the Italian National Research Council (CNR-IOM) in Perugia, the Physics and Geology Department at University of Perugia, and the PolLux Beamline at PSI (Villigen, Switzerland).

Spin waves, also known as "magnons", are analogous to electromagnetic waves for magnetism, and propagate in materials such as iron in a way that is similar to that of waves in the ocean. Compared to electromagnetic waves, magnons are characterised by unique properties that make them ideal for developing miniaturised "analog" computing systems that will be much more efficient than the digital systems currently available.

Until now, modulating spin waves at will was extremely complex. The article published in Advanced Materials presents a new class of emitters, called "magnonic nanoantennas", which allow for the generation of spin waves with controlled shape and propagation. For example, it is possible to obtain radial wavefronts (such as those generated by throwing a stone into a pool of water), or planar wavefronts (as ocean waves on the beach), as well as creating focused directional beams. The article also demonstrates that, by using multiple nanoantennas simultaneously, interference figures can be generated "on command", which is a necessary condition for developing analog computing systems.

The nanoantennas were realized by employing the TAM-SPL technique (developed at Politecnico di Milano in collaboration with Prof. Riedo's group), which allows for the manipulation of the magnetic properties of the materials at the nanoscale. Specifically, the nanoantennas consist of minuscule "ripples" in the magnetisation of the material (called "domain walls" and "vortices") that, when set in motion by an oscillating magnetic field, emit spin waves. Given that the spin waves' properties are linked to the type and peculiar characteristics of these ripples, controlling them very accurately allowed the researchers to modulate the emitted waves as never before.

Credit: 
NYU Tandon School of Engineering

Terahertz radiation technique opens a new door for studying atomic behavior

image: A compressor using terahertz radiation to shorten electron bunches is small enough to fit into the palm of a hand.

Image: 
Dawn Harmer/SLAC National Accelerator Laboratory

Researchers from the Department of Energy's SLAC National Accelerator Laboratory have made a promising new advance for the lab's high-speed "electron camera" that could allow them to "film" tiny, ultrafast motions of protons and electrons in chemical reactions that have never been seen before. Such "movies" could eventually help scientists design more efficient chemical processes, invent next-generation materials with new properties, develop drugs to fight disease and more.

The new technique takes advantage of a form of light called terahertz radiation, instead of the usual radio-frequency radiation, to manipulate the beams of electrons the instrument uses. This lets researchers control how fast the camera takes snapshots and, at the same time, reduces a pesky effect called timing jitter, which prevents researchers from accurately recording the timeline of how atoms or molecules change.

The method could also lead to smaller particle accelerators: Because the wavelengths of terahertz radiation are about a hundred times smaller than those of radio waves, instruments using terahertz radiation could be more compact.

The researchers published the findings in Physical Review Letters on February 4.

A Speedy Camera

SLAC's "electron camera," or ultrafast electron diffraction (MeV-UED) instrument, uses high-energy beams of electrons traveling near the speed of light to take a series of snapshots - essentially a movie - of action between and within molecules. This has been used, for example, to shoot a movie of how a ring-shaped molecule breaks when exposed to light and to study atom-level processes in melting tungsten that could inform nuclear reactor designs.

The technique works by shooting bunches of electrons at a target object and recording how electrons scatter when they interact with the target's atoms. The electron bunches define the shutter speed of the electron camera. The shorter the bunches, the faster the motions they can capture in a crisp image.

"It's as if the target is frozen in time for a moment," says SLAC's Emma Snively, who spearheaded the new study.

For that reason, scientists want to make all the electrons in a bunch hit a target as close to simultaneously as possible. They do this by giving the electrons at the back a little boost in energy, to help them catch up to the ones in the lead.
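
The effect of that small energy boost can be illustrated with basic relativistic kinematics: a slightly more energetic electron travels marginally closer to the speed of light, so over a short drift it gains time on the electrons ahead of it. The numbers below are assumptions chosen only for illustration, not SLAC's operating parameters.

```python
import numpy as np

MC2 = 0.511e6     # electron rest energy, eV
C = 2.998e8       # speed of light, m/s

def beta(kinetic_ev):
    """Relativistic speed (fraction of c) for a given kinetic energy in eV."""
    gamma = 1.0 + kinetic_ev / MC2
    return np.sqrt(1.0 - 1.0 / gamma**2)

# Assumed numbers: a ~3 MeV bunch whose trailing electrons receive a
# 100 keV boost, then drift 0.5 m before reaching the target.
drift = 0.5                                  # m
t_front = drift / (C * beta(3.0e6))          # travel time of the unboosted front
t_back  = drift / (C * beta(3.1e6))          # travel time of the boosted tail
print((t_front - t_back) * 1e15, "fs gained by the boosted tail electrons")
```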

So far, researchers have used radio waves to deliver this energy. But the new technique developed by the SLAC team at the MeV-UED facility uses light at terahertz frequencies instead.

Why terahertz?

A key advantage of using terahertz radiation lies in how the experiment shortens the electron bunches. In the MeV-UED facility, scientists shoot a laser at a copper electrode to knock off electrons and create beams of electron bunches. And until recently, they typically used radio waves to make these bunches shorter.

However, the radio waves also boost each electron bunch to a slightly different energy, so individual bunches vary in how quickly they reach their target. This timing variance is called jitter, and it reduces researchers' abilities to study fast processes and accurately timestamp how a target changes with time.

The terahertz method gets around this by splitting the laser beam into two. One beam hits the copper electrode and creates electron bunches as before, and the other generates the terahertz radiation pulses for shortening the electron bunches. Since they were produced by the same laser beam, electron bunches and terahertz pulses are now synchronized with each other, reducing the timing jitter between bunches.

Down to the femtosecond

A key innovation for this work, the researchers say, was creating a particle accelerator cavity, called the compressor. This carefully machined hunk of metal is small enough to sit in the palm of a hand. Inside the device, terahertz pulses shorten electron bunches and give them a targeted and effective push.

As a result, the team could compress electron bunches so they last just a few tens of femtoseconds, or quadrillionths of a second. That's not as much compression as conventional radio-frequency methods can achieve now, but the researchers say the ability to simultaneously lower jitter makes the terahertz method promising. The smaller compressors made possible by the terahertz method would also mean lower cost compared to radio-frequency technology.

"Typical radio-frequency compression schemes produce shorter bunches but very high jitter," says Mohamed Othman, another SLAC researcher on the team. "If you produce a compressed bunch and also reduce the jitter, then you'll be able to catch very fast processes that we've never been able to observe before."

Eventually, the team says, the goal is to compress electron bunches down to about a femtosecond. Scientists could then observe the incredibly fast timescales of atomic behavior in fundamental chemical reactions like hydrogen bonds breaking and individual protons transferring between atoms, for example, that aren't fully understood.

"At the same time that we are investigating the physics of how these electron beams interact with these intense terahertz waves, we're also really building a tool that other scientists can use immediately to explore materials and molecules in a way that wasn't possible before," says SLAC's Emilio Nanni, who led the project with Renkai Li, another SLAC researcher. "I think that's one of the most rewarding aspects of this research."

This project was funded by DOE's Office of Science. The MeV-UED instrument is part of SLAC's Linac Coherent Light Source, a DOE Office of Science user facility.

SLAC is a vibrant multiprogram laboratory that explores how the universe works at the biggest, smallest and fastest scales and invents powerful tools used by scientists around the globe. With research spanning particle physics, astrophysics and cosmology, materials, chemistry, bio- and energy sciences and scientific computing, we help solve real-world problems and advance the interests of the nation.

SLAC is operated by Stanford University for the U.S. Department of Energy's Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time.

Credit: 
DOE/SLAC National Accelerator Laboratory

Using holograms helps in studying the quality of composite materials

An article recently published in the scientific journal IEEE Transactions on Industrial Informatics describes a digital holography method for assessing the parameters of composite materials, in particular materials with ceramic coatings, which significantly increase resistance to high temperatures. The coating is applied by a plasma jet at a very high temperature, and as the material cools, residual stresses arise in it that can affect its strength characteristics. A method is therefore needed to record and evaluate these deformed states immediately after the spraying is completed.

One of the study's authors, Igor Alekseenko, Head of the IKBFU Coherent-optical Measuring Systems Laboratory, told us:

"There is a well-known and affordable method of controlling such stresses. A special sensor is attached to the sample, then a technological hole is made at one point, after which the residual stresses are released and recorded, and then analyzed. This method has two drawbacks. The first is that the sensor is attached to the sample, distorts the results. The second is that using this method it is possible to study the material at only a few points. In our work, we used the method of optical non-destructive testing, which, firstly, is non-contact, that is, localized sensors do not need to be installed on the material, and, secondly, it allows us to register strains at every point of the image. A sophisticated and highly sensitive measuring technique registers several hundreds of digital holograms, which make it possible to study mechanical processes in time on a set of points on the surface of a sample. The result is more accurate and complete information about the state of the object. At the first stage, the studies were conducted mainly by our German colleagues from the Institute of Technical Optics at the University of Stuttgart. Then the research was already carried out together with the specialists of the laboratory of Coherent-optical Measuring Systems of the IKBFU, whose employees have extensive experience in the field of digital holography and optical non-destructive testing"

According to Igor Alekseenko, implementing the adapted control method will improve the production process of composite materials and significantly enhance their performance characteristics.

Credit: 
Immanuel Kant Baltic Federal University

New software tool fosters quality control of genome-scale models of metabolism

image: New software tool fosters quality control of genome-scale models of metabolism

Image: 
Nikolaus Sonnenschein

The application areas of genome-scale metabolic models are widespread, ranging from designing cell factories and investigating cancer metabolism to analysing how microbes interact within our guts. Hence, the number of manually and automatically generated models published every year keeps growing. That might seem like a purely positive development, but many of these models are very difficult for others to reproduce and reuse in different contexts.

Therefore, scientists from The Novo Nordisk Foundation Center for Biosustainability (DTU Biosustain), together with a large group of biotechnology researchers from across the globe, set out to address the problem.

In a new study published in Nature Biotechnology, they present the tool Memote which is a community-maintained, standardised set of metabolic model tests. The tests cover a range of aspects from annotations to conceptual integrity and can be extended to include experimental datasets for automatic model validation.

Thus, the hope is that Memote will allow both scientists and biotech companies to develop better-performing models more efficiently. From a sustainability perspective, this will let cell engineers design and build better cell factories faster than is currently possible, since engineering microbes to produce biochemicals is at present a very time-consuming and expensive process.
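
To make concrete what one of these automated checks looks like, the snippet below uses cobrapy, the Python library Memote builds on, to flag reactions that are not mass balanced. It is a hedged illustration of the kind of test such a suite standardises, not Memote itself, and "my_model.xml" is a hypothetical SBML file standing in for any published reconstruction.

```python
import cobra

# Not memote itself: a minimal cobrapy sketch of one kind of check that a
# standardised test suite automates (reaction mass balance).
model = cobra.io.read_sbml_model("my_model.xml")   # hypothetical model file

unbalanced = []
for reaction in model.reactions:
    if reaction.boundary:
        # Exchange and other boundary reactions are unbalanced by design.
        continue
    # check_mass_balance() returns an empty dict when the reaction balances;
    # metabolites without chemical formulas may make the result incomplete.
    imbalance = reaction.check_mass_balance()
    if imbalance:
        unbalanced.append((reaction.id, imbalance))

print(f"{len(unbalanced)} of {len(model.reactions)} reactions are not mass balanced")
for rxn_id, imbalance in unbalanced[:5]:
    print(f"  {rxn_id}: {imbalance}")
```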

Quality assurance in high demand

"If you work with model organisms such as yeast or the gut microbe Escherichia coli you are lucky because models for those organisms are quite predictive and have been refined over many years. But people in the community have been aware that a number of published models contain significant flaws," says Christian Lieven, who is a former postdoc at DTU Biosustain and now CSO and Co-founder of Unseen Bio.

Co-author Moritz Beber adds: "We conducted a quantitative assessment of thousands of published models. While the majority of them are in a reasonable state, memote was able to reveal specific problems in all of the models. This study underlines the utility of memote to assess and improve metabolic models".

Promotes openness and collaboration

Another issue that Memote addresses, by applying best practices from the field of software engineering, is the continuous improvement and versioning of models before and after publication. Memote integrates with modern IT technologies such as the social coding platform GitHub and allows researchers to collaboratively improve models while making sure their quality never drops.

"Keeping a track record of model development is absolutely essential for both attributing credit but also for facilitating accountability. Furthermore, a lot of biological knowledge is uncovered from publications and documented in the process of reconstructing genome-scale metabolic models. Who knows, given the fact that reconstructions are knowledge-bases, maybe Memote might even provide the means of publishing detailed reviews about organisms' metabolisms in the future," says Bernhard Palsson, co-author and CEO of DTU Biosustain.

Better cell factory engineering

Memote enables a quick comparison of any two given models to assess which one is suited best for the selected host organism. A model is tested for a wide range of factors such as the production of biomass precursors, biomass consistency and growth rate. Ultimately, this should lead to a more rational approach to cell factory design.
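
As a hypothetical sketch of what such a comparison might look like in cobrapy (the file names below are placeholders, not models from the study), flux balance analysis maximises each model's biomass objective, and the optimal value is the predicted growth rate:

```python
import cobra

# Hypothetical comparison of two candidate reconstructions of the same
# host organism; the file names are placeholders. Flux balance analysis
# (FBA) maximises the biomass objective, and the optimal value is the
# predicted growth rate (per hour).
model_a = cobra.io.read_sbml_model("host_model_v1.xml")
model_b = cobra.io.read_sbml_model("host_model_v2.xml")

for name, model in [("v1", model_a), ("v2", model_b)]:
    solution = model.optimize()
    print(f"{name}: status={solution.status}, "
          f"predicted growth rate={solution.objective_value:.3f} 1/h")
```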

"Today, we have so much data and knowledge about the biological pathways operating inside these industrial microorganisms. This enables us to use mathematical models to simulate the effects of genetic modifications and thus facilitate a more rational approach to the design of cell factories. I hope memote will facilitate the development of predictive models of many more organisms and thus also diversify the spectrum of available production hosts in cell factory engineering." says corresponding author and Associate Professor at DTU Bioengineering, Nikolaus Sonnenschein.

Credit: 
Technical University of Denmark

Illness won't stop vampire bat moms from caring for their offspring

COLUMBUS, Ohio - A study of social interactions among vampire bats that felt sick suggests family comes first when illness strikes - and may help explain which social interactions are most likely to contribute to disease transmission.

Most social interactions for bats that felt sick diminished, but vampire bat moms maintained close social connections with their kin. They continued to groom their offspring even if the youngsters seemed sick, and mothers that felt sick also kept grooming their healthy offspring.

"Moms everywhere are probably not surprised to find that vampire bat mothers were the most likely to potentially sacrifice their own health to tend to the needs of their offspring," said Gerald Carter, senior author of the study and assistant professor of evolution, ecology and organismal biology at The Ohio State University.

Carter said the family connection evident in this study bears out what was seen during the initial COVID-19 coronavirus outbreak in China: the disease spread mainly within family groups, because these social connections are not reduced when people are sick.

This study examined grooming and food sharing among vampire bats after some were injected with a substance that activated their immune system and made them feel sick for several hours, but didn't infect them with an actual illness.

Social grooming among unrelated bats that felt sick decreased, but food sharing continued at normal levels - suggesting a social behavior necessary for survival will persist despite the risk for disease and that family members are likely at highest risk of catching another's illness.

"To take a human analogy, a sick person might completely stop shaking hands, but they might still engage in more necessary social interactions - like preparing food for their families," Carter said.

The research is published online in the Journal of Animal Ecology.

Vampire bats are highly social animals that huddle closely in their dark, cave-like roosts and, according to previous research led by Carter, show signs of maintaining "friendships" in the wild.

The experiments were conducted in a captive colony of 36 vampire bats - 24 adult females and their 12 captive-born offspring. Such a combination of unrelated mothers mixed with their offspring is common in the wild, said Carter, also a research associate at the Smithsonian Tropical Research Institute (STRI) in Panama, where this study took place.

The project was led by first author Sebastian Stockmaier, a graduate student at the University of Texas at Austin, and advised by Carter and co-authors Daniel Bolnick of the University of Connecticut and Rachel Page of STRI.

Stockmaier injected each bat with both the immune-challenging molecule and a placebo, in random order, and observed the frequency and types of behavior to determine whether they decreased or remained consistent.
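
A within-subject design like this is commonly analysed by pairing each bat's behavior under the immune challenge with its behavior under the placebo. The Python sketch below uses made-up numbers and is not the study's data or analysis; it simply illustrates a paired comparison of, for example, grooming-partner counts.

```python
import numpy as np
from scipy.stats import wilcoxon

# Not the study's data or analysis: a minimal sketch of how a within-subject
# design is often analysed. Each bat contributes one count of grooming
# partners under the immune challenge and one under the placebo, and the
# paired differences are tested across bats.
rng = np.random.default_rng(3)
n_bats = 36

placebo_partners = rng.poisson(4.0, n_bats)      # made-up counts
challenge_partners = rng.poisson(2.5, n_bats)    # fewer partners when "sick"

stat, p_value = wilcoxon(challenge_partners, placebo_partners)
print(f"median partners: placebo={np.median(placebo_partners):.1f}, "
      f"challenge={np.median(challenge_partners):.1f}")
print(f"Wilcoxon signed-rank test: statistic={stat:.1f}, p={p_value:.4f}")
```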

He observed that sick bats had fewer grooming partners - but the grooming that did occur (chewing and licking another bat's body) was equally intense among sick and healthy pairs of bats.

Food sharing, on the other hand, remained constant. Vampire bats commonly regurgitate their blood meals to feed their hungry counterparts, and donor bats continued that behavior even if the hungry bats appeared to be sick.

"We all know that more socially connected people are more likely to transmit disease to each other and that when sick people are less social, that can slow down how a pathogen spreads," Carter said. "This study demonstrates the type of social connection makes a difference in what social behavior is likely to continue even when people are sick."

Based on their observations, the researchers attributed changes in the sick vampire bats' social behavior to feeling lethargic rather than as an infection-control strategy.

"It's not uncommon for humans who are sick to isolate themselves or avoid each other to prevent disease transmission," Carter said. "For vampire bats, the instinct to stay strong through social interaction likely outweighs the potential benefits of preventing infections among others in their roost."

Credit: 
Ohio State University

Veterinarians: Dogs, too, can experience hearing loss

image: A puppy exhibiting signs of impaired hearing. He does not respond to audible commands unless he can see the person giving them. His owners brought him to the veterinary clinic at the University of Illinois at Urbana-Champaign for testing.

Image: 
Photo by L. Brian Stauffer

CHAMPAIGN, Ill. -- Just like humans, dogs are sometimes born with impaired hearing or experience hearing loss as a result of disease, inflammation, aging or exposure to noise. Dog owners and K-9 handlers ought to keep this in mind when adopting or caring for dogs, and when bringing them into noisy environments, says Dr. Kari Foss, a veterinary neurologist and professor of veterinary clinical medicine at the University of Illinois at Urbana-Champaign.

In a new report in the journal Topics in Companion Animal Medicine, Foss and her colleagues describe cases of hearing loss in three working dogs: a gundog, a sniffer dog and a police dog. One of the three had permanent hearing loss, one responded to treatment and the third did not return to the facility where it was originally diagnosed for follow-up care.

The case studies demonstrate that those who work with police or hunting dogs "should be aware of a dog's proximity to gunfire and potentially consider hearing protection," Foss said. Several types of hearing protection for dogs are available commercially.

Just as in humans, loud noises can harm the delicate structures of a dog's middle and inner ear.

"Most commonly, noise-induced hearing loss results from damage to the hair cells in the cochlea that vibrate in response to sound waves," Foss said. "However, extreme noise may also damage the eardrum and the small bones within the inner ear, called the ossicles."

Pet owners or dog handlers tend to notice when an animal stops responding to sounds or commands. However, it is easy to miss the signs, especially in dogs with one or more canine companions, Foss said.

"In puppies with congenital deafness, signs may not be noticed until the puppy is removed from the litter," she said.

Signs of hearing loss in dogs include failing to respond when called, sleeping through sounds that normally would rouse them, startling at loud noises that previously didn't bother them, barking excessively or making unusual vocal sounds, Foss said. Dogs with deafness in one ear might respond to commands but could have difficulty locating the source of a sound.

If pet owners think their pet is experiencing hearing loss, they should have the animal assessed by a veterinarian, Foss said. Hearing loss that stems from ear infections, inflammation or polyps in the middle ear can be treated and, in many cases, resolved.

Hearing-impaired or deaf dogs may miss clues about potential threats in their surroundings, Foss said.

"They are vulnerable to undetected dangers such as motor vehicles or predators and therefore should be monitored when outside," she said.

If the hearing loss is permanent, dog owners can find ways to adapt, Foss said.

"Owners can use eye contact, facial expressions and hand signals to communicate with their pets," she said. "Treats, toy rewards and affection will keep dogs interested in their training." Blinking lights can be used to signal a pet to come inside.

Hearing loss does not appear to affect dogs' quality of life, Foss said.

"A dog with congenital hearing loss grows up completely unaware that they are any different from other dogs," she said. "Dogs that lose their hearing later in life may be more acutely aware of their hearing loss, but they adapt quite well. A dog's life would be significantly more affected by the loss of smell than by a loss of hearing."

Credit: 
University of Illinois at Urbana-Champaign, News Bureau