Tech

Research reveals structure of nanomachine that assembles a cell's energy control system

Researchers from the University of Sussex have determined the structure of a tiny multi-protein biological machine, furthering our understanding of human cells and helping to enhance research into cancer, neurodegeneration and other illnesses.

A biological nanomachine is a macromolecular machine commonly found within the cell, often in the form of a multi-protein complex, that performs tasks essential for life.

The nanomachine R2TP-TTT acts as a molecular chaperone that assembles other nanomachines in the human cell. It is especially important for constructing mTORC1 - a complicated nanomachine that regulates the cell's energy metabolism, and which often becomes misregulated in human diseases such as cancer and diabetes.

Scientists from the School of Life Sciences at Sussex, working in collaboration with colleagues at CNIO Madrid, MRC-LMB Cambridge and the University of Leeds, used state-of-the-art cryo-electron microscopy (cryoEM) to build a detailed image of the R2TP-TTT nanomachine that shows the arrangement of all the proteins. It also reveals how the TTT proteins control the R2TP machine to allow it to hold components of mTORC1 ready for assembly.

Lead researcher, Dr. Mohinder Pal, working in the laboratories of Dr. Chris Prodromou and Professor Laurence Pearl FRS at Sussex, worked out how to make and purify all the proteins using an insect cell system, and apply them in an ultra-thin layer that could be frozen in liquid ethane to preserve their atomic structure. Images of the frozen protein particles magnified more than 50,000 times were then collected on cryo-electron microscopes in Madrid, Harwell and Leeds. These were then combined using a technology related to medical tomography, to give the final detailed image of the R2TP-TTT, in which the molecular detail could be seen and analysed.

Professor Pearl, who co-supervised the work with Dr. Prodromou and Prof. Llorca (Madrid), commented:

"Previously we've been able to work out the structures of protein molecules, using a technique called X-ray crystallography, but usually only individually or in pieces. The revolution in cryoEM technology over the last couple of years has given us the ability to look at the large assemblies of proteins as they actually exist in the cell, and really understand how they work as biological nanomachines."

With the help of the RM Phillips Charitable Trust, the University of Sussex has made a multi-million pound investment to establish cryo-electron microscopy in the School of Life Sciences. The new state-of-the-art cryoARM200 cryo-electron microscope, made by the Japanese company JEOL, has just been installed in the John Maynard Smith building at the University, and will be fully functioning in the summer.

Professor Pearl said:

"Having our own instrument on site, will greatly increase the speed with which we can reveal the structures of a huge range of biological nanomachines being studied by colleagues in Life Sciences. This will massively enhance the world-leading work going on here at Sussex to understand cancer, neurodegeneration and viral diseases, and to develop new treatments".

Credit: 
University of Sussex

UCF study finds smaller turtles are nesting on Florida beaches

ORLANDO, July 8, 2021 - A new University of Central Florida study indicates that smaller loggerhead and green sea turtles are nesting on Florida beaches than in the past; however, researchers aren't sure why.

The findings, published this month in the journal Ecosphere, give clues to the status of the turtles, which is important to researchers who are monitoring the population health of the threatened species.

Central Florida's Atlantic coastline hosts about one-third of all green turtle nests in the state and is one of the most important nesting areas in the world for loggerheads.

Sea turtles are important as iconic symbols of conservation in Florida and for the role they play in maintaining a healthy ocean ecosystem.

The reason for the appearance of smaller nesting turtles is still a mystery though, says Katrina Phillips, the study's lead author and a doctoral candidate in UCF's Department of Biology.

"It might be that juvenile turtles are growing more slowly because they are having a harder time finding food as a result of habitat degradation or competition from other turtles," Phillips says. "Or smaller turtles may also be new recruits to the population as a result of successful sea turtle conservation efforts. We don't know why we're seeing more small turtles nesting."

The researchers made the discovery by comparing the shell lengths of nearly 10,000 nesting female loggerheads and more than 3,000 nesting female green turtles. The measurements were collected by UCF's Marine Turtle Research Group over a 37-year period, from 1982 to 2019.

The nesting turtles were observed in the Brevard County portion of the Archie Carr National Wildlife Refuge. Age is not recorded or known because determining it requires examining a cross-section of the turtle's leg bone - an invasive sampling procedure - and even then, at best, age can only be estimated.

The researchers found that the average size of nesting loggerheads decreased by nearly 1 inch and the average size of nesting green sea turtles decreased by more than 1.5 inches since 1982.
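To illustrate the kind of trend estimate involved, the sketch below fits a simple linear regression of carapace length against nesting year. The file name and column names (year, species, carapace_cm) are hypothetical and do not reflect the study's actual data or statistical analysis.

```python
# Illustrative sketch only: estimates a long-term trend in nesting female
# carapace length from a hypothetical measurement file. Column names
# ("year", "species", "carapace_cm") are assumptions, not the study's format.
import numpy as np
import pandas as pd

df = pd.read_csv("nesting_measurements.csv")  # hypothetical file

for species, group in df.groupby("species"):
    # Ordinary least-squares slope of carapace length against nesting year
    slope_cm_per_year, intercept = np.polyfit(group["year"], group["carapace_cm"], 1)
    total_change_cm = slope_cm_per_year * (2019 - 1982)  # change over the 37-year record
    # Negative values indicate a shrinking average size
    print(f"{species}: {total_change_cm:.2f} cm change since 1982 "
          f"({total_change_cm / 2.54:.2f} inches)")
```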

In addition to raising questions about why the turtles are smaller, the findings also mean that when estimating female sea turtle maturity based on size, researchers and management agencies will need to consider smaller turtles in their estimates.

"The numbers we provide for the minimum size range of mature females will help other groups who study turtles in the water, where it's not clear if they are mature or not, better estimate which of theirs are juveniles," she says.

The extensive study was made possible by the long-running work of UCF's Marine Turtle Research Group, the researcher says.

"Many nesting beach projects take these measurements, but the UCF project is unique because of how long it's been going on and how many turtles come ashore to nest here," Phillips says. "Florida gets more loggerhead nests than anywhere else in the world, and the green turtle nest numbers are rising."

The monitoring project was started in 1982 by UCF Professor Emeritus and Pegasus Professor of Biology Llewellyn "Doc" Ehrhart.

Phillips says the UCF Marine Turtle Research Group will continue monitoring the nesting sites, which will allow researchers to assess if the trends continue or change.

Credit: 
University of Central Florida

Study identifies genetic risks for suicide death in individuals with bipolar disorder

A new study shows that individuals with bipolar disorder who are exposed to significant trauma may be at greater risk for suicide death, suggesting that clinical diagnosis of or genetic predisposition to trauma-related conditions could be important factors to consider in suicide prevention.

Suicide is the 10th leading cause of death in the United States, accounting for over 40,000 deaths each year, and suicide death rates are 10 to 30 times higher for people with bipolar disorder than for the general population.

The research, spearheaded by Eric Monson, MD, PhD, and Hilary Coon, PhD, from the University of Utah, in collaboration with Virginia Willour, PhD, from the University of Iowa, set out to identify unique risk factors for suicide attempt and death within bipolar disorder.

"There are many factors that go into increased risk for suicide--genetics is one of them," says Willour, a professor of psychiatry in the UI Carver College of Medicine. "We want to understand what the risk factors are so we can move forward with better interventions and decrease the rates of suicide."

The team's findings were recently published in a research paper, titled "Assessment of suicide attempt and death in bipolar affective disorder: a combined clinical and genetic approach," in the journal Translational Psychiatry.

Monson, first author on the study and a former graduate student of Willour's, says this research is a first meaningful assessment of risk factors that are specific not just to suicide attempt but to suicide death.

"Even though it does not provide a definitive answer, this work provides information that supports the idea that the risk factors for suicide attempt and suicide death may differ from one another," Monson says. "And ongoing research is going to be really critical to make sure we make the best use of valuable resources to prevent suicide."

Primary results of the study demonstrate that diagnoses of trauma-associated disorders, such as post-traumatic stress disorder (PTSD), are much more frequent in individuals with bipolar disorder who died from suicide than in all other groups--including those who attempted suicide.

The researchers' analysis also demonstrates a genetic predisposition to developing PTSD within individuals with bipolar disorder who died from suicide.

Additionally, the findings suggest that PTSD genetic risk factors derived from males were found more frequently in individuals with bipolar disorder who died from suicide, but genetic risk factors derived from females were associated with both suicide death and attempt.

Understanding how genetic variation contributes to suicide risk can help identify different strategies or potential medications to bring relief to patients at greatest risk of suicide.

"This is not a job to us--it's not even a career," says Willour, a senior author on the study and a member of the Iowa Neuroscience Institute. "This is a mission, to decrease suicide rates and do it in a way that brings relief to the patient as soon as possible."

Funded primarily by a grant from the American Foundation for Suicide Prevention, the study is the largest combined clinical and genetic effort to investigate risk factors for suicide death in bipolar disorder and uses the single largest sample of individuals who have died from suicide in the world.

"There have been decades of work that have gone into preparing this data," Monson says. "We would have no access to data of this caliber, of these numbers, if it weren't for the efforts of collaboration. There have been hundreds of investigators, and thousands of individuals who have donated their time, their DNA samples--all of these different things to make this possible."

It is critical to identify these potential risk factors because death from suicide is inherently preventable, and any tools to better predict those at greatest risk may aid in leveraging highly limited mental-health resources to reach those who need them most.

"Suicide is preventable--that doesn't get said enough," Monson says. "That's why screening is so important, and that's why all these steps of research that we take really matter. When you have something that is the worst possible outcome for an illness but is completely preventable--we have to do something about that."

Credit: 
University of Iowa Health Care

First study of nickelate's magnetism finds a strong kinship with cuprate superconductors

image: The first measurements of magnetic excitations rippling through a nickelate superconductor show it has a strong kinship with cuprate superconductors, like the one at left, as opposed to the more distant neighborly relationship depicted at right. The study by researchers at SLAC, Stanford and Diamond Light Source revealed important similarities and subtle differences between the two materials, which conduct electricity with no loss at relatively warm temperatures.

Image: 
Greg Stewart/SLAC National Accelerator Laboratory

Ever since the 1986 discovery that copper oxide materials, or cuprates, could carry electrical current with no loss at unexpectedly high temperatures, scientists have been looking for other unconventional superconductors that could operate even closer to room temperature. This would allow for a host of everyday applications that could transform society by making energy transmission more efficient, for instance.

Nickel oxides, or nickelates, seemed like a promising candidate. They're based on nickel, which sits next to copper on the periodic table, and the two elements have some common characteristics. It was not unreasonable to think that superconductivity would be one of them.

But it took years of trying before scientists at the Department of Energy's SLAC National Accelerator Laboratory and Stanford University finally created the first nickelate that showed clear signs of superconductivity.

Now SLAC, Stanford and Diamond Light Source researchers have made the first measurements of magnetic excitations that spread through the new material like ripples in a pond. The results reveal both important similarities and subtle differences between nickelates and cuprates. The scientists published their results in Science today.

"This is exciting, because it gives us a new angle for exploring how unconventional superconductors work, which is still an open question after 30-plus years of research," said

Haiyu Lu, a Stanford graduate student who did the bulk of the research with Stanford postdoctoral researcher Matteo Rossi and SLAC staff scientist Wei-Sheng Lee.

"Among other things," he said, "we want to understand the nature of the relationship between cuprates and nickelates: Are they just neighbors, waving hello and going about their separate ways, or more like cousins who share family traits and ways of doing things?"

The results of this study, he said, add to a growing body of evidence that their relationship is a close one.

Spins in a checkerboard

Cuprates and nickelates have similar structures, with their atoms arranged in a rigid lattice. Both come in thin, two-dimensional sheets that are layered with other elements, such as rare-earth ions. These thin sheets become superconducting when they're cooled below a certain temperature and the density of their free-flowing electrons is adjusted in a process known as doping.

The first superconducting nickelate was discovered in 2019 at SLAC and Stanford. Last year, the same SLAC/Stanford team that performed this latest experiment published the first detailed study of the nickelate's electronic behavior. That study established that in undoped nickelate, electrons flow freely in the nickel oxide layers, but the intervening layers also contribute electrons to the flow. This creates a 3D metallic state that's quite different from what is seen in cuprates, which are insulators when undoped.

Magnetism is also important in superconductivity. It's created by the spins of a material's electrons. When they're all oriented in the same direction, either up or down, the material is magnetic in the sense that it could stick to the door of your fridge.

Cuprates, on the other hand, are antiferromagnetic: Their electron spins form a checkerboard pattern, so each down spin is surrounded by up spins and vice versa. The alternating spins cancel each other out, so the material as a whole is not magnetic in the ordinary sense.

Would nickelate have those same characteristics? To find out, researchers took samples of it to the Diamond Light Source synchrotron in the UK for examination with resonant inelastic X-ray scattering, or RIXS. In this technique, scientists scatter X-ray light off a sample of material. This injection of energy creates magnetic excitations - ripples that travel through the material and randomly flip the spins of some of its electrons. RIXS allows scientists to measure very weak excitations that couldn't be observed otherwise.

Creating new recipes

"What we find is quite interesting," Lee said. "The data show that nickelate has the same type of antiferromagnetic interaction that cuprates have. It also has a similar magnetic energy, which reflects the strength of the interactions between neighboring spins that keep this magnetic order in place. This implies that the same type of physics is important in both."

But there are also differences, Rossi noted. Magnetic excitations don't spread as far in nickelates, and die out more quickly. Doping also affects the two materials differently; the positively charged "holes" it creates are concentrated around nickel atoms in nickelates and around oxygen atoms in cuprates, and this affects how their electrons behave.

As this work continues, Rossi said, the team will test how doping the nickelate in various ways and swapping different rare earth elements into the layers between the nickel oxide sheets affect the material's superconductivity - paving the way, they hope, to the discovery of better superconductors.

Credit: 
DOE/SLAC National Accelerator Laboratory

Many nonprofits, companies report using commercial species in tree planting projects

Nonprofits and companies planting trees in the tropics may often pick species for their commercial rather than ecological value, researchers found in a new analysis of organizations' publicly available data. They also found many may not have tracked the trees' survival.

Tree planting is a promising, but controversial, restoration strategy for fighting climate change. A new study in the journal Biological Conservation provides a detailed look at what restoration organizations across the tropics are actually doing on the ground.

"We found some organizations placed an emphasis on biological diversity and forest restoration in their mission statements. When we looked at the species they reported planting, many organizations reported planting commercial species, with chocolate, mango and teak in the top five," said the study's first author Meredith Martin, assistant professor of forestry and environmental resources at NC State. Martin led the study with researchers from The Nature Conservancy, a nonprofit that was also included in the analysis.

For the study, researchers analyzed publicly available data from websites and annual reports for 136 nonprofits and 38 for-profit companies, gathered using internet searches and referrals from Yale University's Environmental Leadership and Training Initiative. Their analysis included projects focused on forest conservation, economic development, or humanitarian aims in 74 different countries, all located in the tropics or subtropics. Brazil, Kenya and Indonesia were the countries with the largest number of projects.

Of these organizations, 118 reported the numbers of trees they planted. In total, they reported planting 1.4 billion trees since 1961. At their estimated average rate of planting in the tropics, it would take more than a thousand years to plant a trillion trees - a goal set by at least three global initiatives.
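As a rough illustration of that rate arithmetic, the back-of-envelope check below uses only the figures quoted in this article; the study's own rate estimate for the tropics may differ.

```python
# Back-of-envelope check of the planting-rate arithmetic reported above.
# Figures come only from this article, not from the study's own rate estimate.
reported_trees = 1.4e9          # trees reported planted since 1961
years_reporting = 2021 - 1961   # roughly 60 years of reporting
goal = 1e12                     # "a trillion trees" goal

rate_per_year = reported_trees / years_reporting
years_to_goal = goal / rate_per_year
print(f"~{rate_per_year:,.0f} trees/year -> ~{years_to_goal:,.0f} years to reach a trillion")
# At this reported rate the goal would indeed take well over a thousand years.
```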

Organizations reported planting a total of 682 species - a fraction of the roughly 50,000 species of trees found in the tropics, Martin said. Because data on the number of trees planted per species were not available, the researchers instead estimated the percentage of organizations planting certain species. The most commonly reported species, ranked by the number of projects reporting them, were cacao, teak, moringa, mango and coffee.

Nearly half of the groups didn't mention their planting method. The most common planting method was agroforestry, which is the integration of trees into animal or crop farming operations. Ten percent talked about planting using assisted regeneration, seven percent focused on enrichment planting, and two percent focused on natural regeneration.

"There's been a lot of research looking at natural regeneration, which is protecting a forest and letting it regrow," Martin said. "It can be cheaper, and more effective in terms of accumulating biomass and species diversity. There are also ways of assisting regeneration to encourage the species you want."

Thirty-two individual organizations mentioned monitoring tree survival. Of those, eight mentioned measuring survival rates and seven mentioned maintenance of plantings. Three groups gave detailed information about monitoring, and two mentioned they worked with outside groups for monitoring or certification.

"If you're not monitoring whether the trees you're planting are surviving, or taking steps to ensure they're surviving or growing, that could be a waste of money and effort," Martin said.

Researchers say the findings are important as groups increasingly look to plant trees to mitigate climate change.

"Trees are natural and incredibly efficient carbon capture entities," Martin said. "They're also living organisms. They're not just machines we can put down anywhere. Organizations need to be thoughtful about what species they are going to use and how they make sure they match the environment, as well as tracking to make sure they're not wasting money on something that doesn't work."

Credit: 
North Carolina State University

FEWSION: Creating more resilient supply chains through nature-inspired design

image: Illustration showing how building diversity into supply chains can help buffer cities against supply chain disruptions

Image: 
Northern Arizona University

A new paper in Nature lays out the way natural ecosystems parallel U.S. supply chains and how American cities can use these tools to strengthen their supply chains.

The paper, "Supply chain diversity buffers cities against food shocks," is co-authored by Benjamin Ruddell, director of the FEWSION Project and the School of Informatics, Computing, and Cyber Systems at Northern Arizona University, and Richard Rushforth, an assistant research professor in SICCS, in collaboration with FEWSION project researchers at Penn State. FEWSION is an NSF-funded collaboration that uses comprehensive data mapping to monitor domestic supply chains for food, water and energy down to the county level.

This research looks at the importance of diversity within the supply chain, which helps to reduce damaging disruptions from supply chain shocks. Supply chains work a lot like food webs in natural ecosystems, in which biodiversity allows for adaptation during disruptions. The analogy turned out to be incredibly insightful, particularly in looking at "black swan" events, which are unpredictable and hard to protect against--and for which adaptation, not prevention, is the main defense.

"This is why ecological theory is so important--if we have diverse supply chains that mimic ecological systems, they can more readily adapt to unforeseeable shocks," Ruddell said. "We can use this nature-inspired design to create more resilient supply chains."

The study examined a history of food flow data for U.S. cities, asking whether the diversity of a city's food supply chain explains the resilience of the city's food supply to shocks. They found that the diversity of a city's supply chain explains more than 90 percent of the intensity, duration and frequency of historically observed food supply shocks in U.S. cities.
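As a rough illustration of how supply chain diversity can be quantified, the sketch below computes the Shannon entropy of each city's inbound supplier shares. This is a generic diversity index with a hypothetical file and hypothetical column names; it is not necessarily the metric or the data used in the FEWSION study.

```python
# Minimal sketch: quantify a city's inbound food-supply diversity as the Shannon
# entropy of supplier shares. Generic diversity index; file and columns are assumed.
import numpy as np
import pandas as pd

flows = pd.read_csv("city_food_inflows.csv")  # assumed columns: city, supplier, tons

def shannon_diversity(tons: pd.Series) -> float:
    shares = tons / tons.sum()        # each supplier's share of the city's inflow
    shares = shares[shares > 0]
    return float(-(shares * np.log(shares)).sum())

diversity = flows.groupby("city")["tons"].apply(shannon_diversity)
print(diversity.sort_values(ascending=False).head())
# A next step would be regressing observed shock intensity/duration on this index.
```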

This model worked regardless of the cause of the shock, which Ruddell said is both profound and practical.

"We now have a simple and effective mathematical basis for policies to boost a city's supply chain resilience," he said. "Coming years will reveal how broadly this finding holds for other kinds of supply chains. Does it hold for households? Nations? Electricity? Telecommunications?"

This practical solution can help lead communities to develop resilient supply chains and better supply chain policy. It is particularly important in today's economy, as the United States has seen several serious threats to critical supply chains nationwide in the last 18 months, including the global COVID-19 supply chain crunch, the Colonial Pipeline and JBS meat processing ransomware attacks, the Suez Canal blockage, the hack of water supplies in Florida and the ERCOT power and water outage. It also is significant as the country heads into hurricane season, as several major ports often are in the direct line of damaging storms.

Additionally, international trends have weakened supply chains over the course of several decades, including the globalization of supply chains, which has increased reliance on manufacturers in Asia; just-in-time manufacturing and distribution, which reduces inventories; and global price competition, which has concentrated production in one or two huge producers. The Biden Administration has recognized supply chain resilience as a major priority and is setting national goals, which Ruddell said is a step in the right direction, but communities can proactively strengthen their own supply chains as well to help prepare for disaster.

"This finding is also promising for the intellectual synthesis of data science, network theory and ecologically inspired (or nature-based) resilience thinking and argues for further investment in the basic research that led to this practical breakthrough," Ruddell said. "This method can underpin a new class of federal regulations for critical supply chain security and can be used to make our economy more secure and resilient. We're excited to see where these ideas take us in coming years. Resilience, based on solid science, is excellent policy for an unpredictable 21st century."

Interested readers should visit the FEWSION Project website and explore the FEW-View tool, which maps the diversity and resilience of supply chains for U.S. cities.

Credit: 
Northern Arizona University

Ecologists compare accuracy of lidar technologies for monitoring forest vegetation

image: Researcher Jon Donager uses a handheld mobile lidar scanning (MLS) device

Image: 
Northern Arizona University

As light detection and ranging (lidar) technology evolves, forest ecology and ecological restoration researchers have been using these tools in a wide range of applications.

"We needed an accounting of relative accuracy and errors among lidar platforms within a range of forest types and structural configurations," said associate professor Andrew Sánchez Meador, executive director of NAU's Ecological Restoration Institute (ERI).

Sánchez Meador led a study recently published in Remote Sensing, "Adjudicating Perspectives on Forest Structure: How Do Airborne, Terrestrial, and Mobile Lidar-Derived Estimates Compare?" The study compared vegetation attributes at multiple scales derived from piloted airborne (ALS), fixed-location terrestrial (TLS) and mobile lidar scanning (MLS) to see how these tools might be used to provide detailed information on forest structure and composition. The researchers, including postdoctoral scholar Jonathon Donager and PhD student Ryan Blackburn, both of ERI and NAU's School of Forestry, found MLS consistently provided accurate structural metrics and can produce accurate estimates of canopy cover and landscape metrics.

"Our findings suggest that MLS has great potential for monitoring a variety of forest attributes," Sánchez Meador said. "These types of scanners cost a fraction of that of other platforms, work equally well indoors and outdoors, are easily deployed and view the forest the same way humans do - from down among the trees - which makes communicating research findings easier."

"As the technology develops further and prices continue to come down," he said, "we expect to see more researchers and managers using these tools for all sorts of applications, from monitoring the effects disturbance events such as fire and flood, to quantifying vital wildlife habitat, to providing baseline data for virtual reality applications and simulation modeling."

As a result of this work, Sánchez Meador and David Huffman, ERI director of research and development, secured funding from the Phoenix-based Salt River Project (SRP) to examine the ability of MLS to rapidly assess forest structural conditions in mixed-conifer forests and the amount and distribution of coarse woody debris, an important component of forest ecosystems.

This research was made possible through funding from NAU's Research Equipment Acquisition Program (REAP), which enabled ERI to purchase a handheld MLS device. This project shows how investment in technology and equipment through the REAP program can be leveraged to support broader, multiple research goals and promote partnerships with companies like SRP.

As ERI's executive director, Sánchez Meador works to advance the institute's focus on restoring western forest landscapes using innovative technologies, service to Native American tribes, promoting novel solutions for the use of tree biomass and wood products and actively engaging with the people and communities that influence land management and depend on these forests.

Credit: 
Northern Arizona University

Meta-analysis finds that omega-3 fatty acids improved cardiovascular outcomes

For decades, there has been great interest in whether omega-3 fatty acids can lower rates of cardiovascular events. In 2018, results from the Reduction of Cardiovascular Events with Icosapent Ethyl-Intervention Trial (REDUCE-IT) were published in the New England Journal of Medicine and showed that a high dose of a purified ethyl ester of eicosapentaenoic acid (EPA) in patients at elevated cardiac risk significantly reduced cardiovascular events. Results from the trial led to U.S. Food and Drug Administration, Health Canada, and European Medicines Agency approval of the prescription drug icosapent ethyl for reducing cardiovascular risk in patients with elevated triglycerides, as well as updates to worldwide guidelines.

But prior and subsequent studies of omega-3 fatty acid supplements that combine EPA and docosahexaenoic acid (DHA) have had mixed results. Investigators from Brigham and Women's Hospital and elsewhere conducted a systematic review and meta-analysis of 38 randomized controlled trials of omega-3 fatty acids. Overall, they found that omega-3 fatty acids improved cardiovascular outcomes. Results, now published in eClinicalMedicine, showed a significantly greater reduction in cardiovascular risk in studies of EPA alone rather than EPA+DHA supplements.

"REDUCE-IT has ushered in a new era in cardiovascular prevention," said senior author Deepak L. Bhatt, MD, MPH, the executive director of Interventional Cardiovascular Programs at the Brigham and lead investigator of the REDUCE-IT trial. "REDUCE-IT was the largest and most rigorous contemporary trial of EPA, but there have been other ones as well. Now, we can see that the totality of evidence supports a robust and consistent benefit of EPA."

Bhatt and colleagues performed a meta-analysis of 38 randomized clinical trials of omega-3 fatty acids, including trials of EPA monotherapy and EPA+DHA therapy. In total, these trials included more than 149,000 participants. They evaluated key cardiovascular outcomes, including cardiovascular mortality, non-fatal cardiovascular outcomes, bleeding, and atrial fibrillation. Overall, omega-3 fatty acids reduced cardiovascular mortality and improved cardiovascular outcomes. The trials of EPA showed higher relative reductions in cardiovascular outcomes compared to those of EPA+DHA.
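For readers unfamiliar with meta-analysis, the sketch below shows generic inverse-variance (fixed-effect) pooling of log risk ratios across trials. The numbers are placeholders, and the published analysis may well use different models (for example, random effects); this is only an illustration of the pooling idea.

```python
# Illustrative fixed-effect, inverse-variance pooling of log risk ratios across trials.
# Placeholder numbers, not trial results; the published study's models may differ.
import numpy as np

log_rr = np.array([np.log(0.75), np.log(0.92), np.log(0.97)])  # hypothetical trial effects
se     = np.array([0.08, 0.05, 0.04])                           # hypothetical standard errors

w = 1.0 / se**2                           # inverse-variance weights
pooled = np.sum(w * log_rr) / np.sum(w)   # pooled log risk ratio
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = pooled + np.array([-1.96, 1.96]) * pooled_se
print(f"Pooled RR = {np.exp(pooled):.2f} (95% CI {np.exp(ci[0]):.2f}-{np.exp(ci[1]):.2f})")
```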

The researchers note that there are crucial biological differences between EPA and DHA -- while both are considered omega-3 fatty acids, they have different chemical properties that influence their stability and strength of the effect that they can have on cholesterol molecules and cell membranes. No trials to date have studied the effects of DHA alone on cardiovascular outcomes.

"This meta-analysis provides reassurance about the role of omega-3 fatty acids, specifically prescription EPA," said Bhatt. "It should encourage investigators to explore further the cardiovascular effects of EPA across different clinical settings."

Credit: 
Brigham and Women's Hospital

Malaria and dengue predicted to affect billions more people

An estimated 8.4 billion people could be at risk from malaria and dengue by the end of the century if emissions keep rising at current levels, according to a new study published in The Lancet Planetary Health.

The research team estimates that this worst-case scenario would mean the population at risk of the diseases might increase by up to an additional 4.7 billion people (relative to the period 1970-1999), particularly in lowland and urban areas, if temperatures rise by about 3.7°C by 2100 compared to pre-industrial levels.

The study was led by the London School of Hygiene & Tropical Medicine (LSHTM) with partners from Umeå University, Sweden; Abdus Salam International Centre for Theoretical Physics, Italy; University of Heidelberg, Germany; and the University of Liverpool.

The team used a range of models to measure the potential impact of climate change on the length of the transmission season and population at risk of two important mosquito-borne diseases - malaria and dengue - by the end of the 21st century compared with 100 years earlier. They made their predictions based on different levels of greenhouse gas emissions, population density (to represent urbanisation) and altitude.

For malaria, the modelling for the worst-case scenario estimated a total of 8.4 billion people being at risk in 2078 (ie 89.3% of an estimated global population of 9.4 billion) compared with an average of 3.7 billion over the period 1970-1999 (ie 75.6% of an estimated global population of 4.9 billion). For dengue, the modelling estimated a total of 8.5 billion people at risk in 2080 compared with an average of 3.8 billion in 1970-1999.
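As a quick arithmetic check, the quoted malaria percentages follow (up to rounding) from the population figures above:

```python
# Quick check that the quoted malaria shares follow from the figures in this article.
at_risk_2078, world_2078 = 8.4e9, 9.4e9
at_risk_hist, world_hist = 3.7e9, 4.9e9
print(f"{at_risk_2078 / world_2078:.1%}")  # ~89.4%; the article quotes 89.3%
print(f"{at_risk_hist / world_hist:.1%}")  # ~75.5%; the article quotes 75.6%
# Small differences reflect rounding of the underlying population estimates.
```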

Malaria suitability is estimated to gradually increase as a consequence of a warming climate in most tropical regions, especially highland areas in the African region (eg Ethiopia, Kenya and South Africa), the Eastern Mediterranean region (eg Somalia, Saudi Arabia and Yemen), and the Americas (eg Peru, Mexico and Venezuela). Dengue suitability is predicted to increase mostly in lowland areas in the Western Pacific region (eg Guam, Vanuatu, Palau) and the Eastern Mediterranean region (eg Somalia and Djibouti), and in highland areas in the Americas (eg Guatemala, Venezuela and Costa Rica).

The research predicts there will be a northward shift of the malaria-epidemic belt in North America, central northern Europe, and northern Asia, and a northward shift of the dengue-epidemic belt over central northern Europe and northern USA because of increases in suitability.

All the scenarios predicted an overall increase in the population at risk of malaria and dengue over the century. However, the impact would reduce substantially if action were taken to reduce global emissions, according to the modelling.

In the scenario where emissions are reduced the most - greenhouse gas emissions decline by 2020 and go to zero by 2100 and global mean temperature increases by 1°C between 2081 and 2100 - an additional 2.35 billion people are predicted to be living in areas suitable for malaria transmission. For dengue in this scenario, the modelling suggests an additional 2.41 billion people could be at risk.

The study highlighted that if emission levels continue to rise at current levels, tropical high-elevation areas (more than 1,000 metres above sea level) in areas such as Ethiopia, Angola, South Africa, and Madagascar could experience up to 1.6 additional climatically suitable months for malaria transmission in 2070-2099 compared with the period 1970-1999.

The study predicted that the length of the dengue transmission season could increase by up to four additional months in tropical lowland areas in south east Asia, sub-Saharan Africa, and the Indian sub-continent.

First author Dr Felipe J Colón-González, Assistant Professor at LSHTM, said: "Our results highlight why we must act to reduce emissions to limit climate change.

"This work strongly suggests that reducing greenhouse gas emissions could prevent millions of people from contracting malaria and dengue. The results show low-emission scenarios significantly reduce length of transmission, as well as the number of people at risk.
Action to limit global temperature increases well below 2°C must continue.

"But policymakers and public health officials should get ready for all scenarios, including those where emissions remain at high levels. This is particularly important in areas that are currently disease-free and where the health systems are likely to be unprepared for major outbreaks."

Climate change has increased concerns that mosquito-borne disease transmission will intensify through increased vector survival and biting rates, increased replication of pathogens within vectors, shorter reproduction cycles, and longer transmission seasons.

Malaria and dengue, the most important mosquito-borne global threats, are being found in more areas, gradually emerging in previously unaffected places, and re-emerging in places where they had subsided for decades. Malaria is shifting towards higher altitudes, and urbanisation is associated with increasing dengue risk.

While differential effects of climate change with altitude and urbanisation have been previously discussed, they have not been quantified globally for different levels of altitude and urbanisation, until now.

The research team's methods involved identifying the risk for each World Health Organization (WHO) region using six emission and socioeconomic scenario combinations ranging from conservative to business as usual, and six disease models.

Although the worst-case scenario models indicated that some areas could become too hot for some mosquito species, that situation would likely cause other health effects such as heat-related mortality, reduced labour productivity, and reduced food production.

Moreover, mosquito-borne diseases could become a bigger problem elsewhere, including expanding further north and into higher altitude and temperate regions, as climatic conditions such as temperature and rainfall enable malaria and dengue to thrive in different parts of the world.

Senior author Dr Rachel Lowe, Associate Professor and Royal Society Dorothy Hodgkin Fellow at LSHTM, said: "A number of interventions will be needed to adapt to the health effects of a warmer and more urbanised world and to prepare for all scenarios.

"Our findings stress the importance of increased surveillance in potential hotspot areas to monitor the emergence of diseases, especially in places without previous experience of dengue or malaria.

"Public health action will be particularly important in areas where transmission is occasional because public health systems might be unprepared to control and prevent these diseases."

The authors acknowledge limitations in the study including the fact they did not consider the effects of socioeconomic development, disease and vector evolution, or the development of more effective drugs and vaccines, all of which could lead to important differences in the amount of risk simulated. The estimates are also constrained by the selection of climate and disease models, and the selected combinations of emission and socioeconomic scenarios. In future experiments, researchers could incorporate larger model ensembles and scenario combinations to provide a richer view of the uncertainty around the estimates.

Credit: 
London School of Hygiene & Tropical Medicine

New study helps explain 'silent earthquakes' along New Zealand's North Island

image: Map of the Hikurangi subduction zone and locations where electromagnetic receivers were deployed to collect data.

Image: 
Christine Chesley, using GeoMapApp and data from William Ryan et al., Geochemistry, Geophysics, Geosystems (2009)

The Hikurangi Margin, located off the east coast of the North Island of New Zealand, is where the Pacific tectonic plate dives underneath the Australian tectonic plate, in what scientists call a subduction zone. This interface of tectonic plates is partly responsible for the more than 15,000 earthquakes the region experiences each year. Most are too small to be noticed, but between 150 and 200 are large enough to be felt. Geological evidence suggests that large earthquakes happened in the southern part of the margin before human record-keeping began.

Geophysicists, geologists, and geochemists from throughout the world have been working together to understand why this plate boundary behaves as it does, producing both imperceptible silent earthquakes, but also potentially major ones. A study published today in the journal Nature offers new perspective and possible answers.

Scientists knew that the ocean floor at the northern part of the island, where the plates slide slowly together, generates the small, slow-moving earthquakes called slow slip events--movements that take weeks, sometimes months to complete. But at the southern end of the island, instead of sliding slowly as they do in the northern area, the tectonic plates lock. This locking sets up the conditions for a sudden release of the plates, which can trigger a large earthquake.

"It is really curious and not understood why, in a relatively small geographic area, you would go from lots of small, slow-moving earthquakes to a potential for a really large earthquake," said marine electromagnetic geophysicist Christine Chesley, a graduate student at Columbia University's Lamont-Doherty Earth Observatory and lead author on the new paper. "That's what we've been trying to understand, the difference in this margin."

In December 2018, a research team began a 29-day deep-sea cruise to collect data. Similar to taking an MRI of the Earth, the team employed electromagnetic wave energy to measure how current moves through features in the ocean floor. From these data, the team was able to get a more precise look at the role seamounts, large undersea mountains, play in generating earthquakes.

"The northern part of the margin has really large seamounts. It had been unclear what those mountains can do when they subduct (dive down into the deep earth) and how that dynamic affects the interaction between the two plates," said Chesley.

It turns out, the seamounts hold a lot more water than geophysicists had expected -- about three to five times more than typical oceanic crust. The abundant water lubricates the plates where they join, helping to smooth any slippage, and preventing the plates from the sticking that can set up a large earthquake. This helps explain the tendency toward the slow, silent earthquakes at the northern end of the margin.

Using these data, Chesley and her colleagues were also able to closely examine what is happening as a seamount subducts. They discovered an area in the upper plate that seems to be damaged by a subducting seamount. This upper plate zone also seemed to have more water in it.

"That suggests the seamount is breaking up the upper plate, making it weaker, which helps explain the unusual pattern of silent earthquakes there," said Chesley. The example provides another indication of how seamounts influence tectonic behavior and earthquake hazards.

Conversely, the absence of this lubrication and of the weakening effects of seamounts may make the southern part of the island more prone to sticking and generating large earthquakes.

Chesley, who is on track to complete her Ph.D. in the fall, hopes that these findings will encourage researchers to consider the way water in these seamounts contributes to seismic behavior as they continue to work to understand slow-moving earthquakes. "The more we study earthquakes, the more it seems water plays a starring role in modulating slip on faults," said Chesley. "Understanding when and where water is input into the system can only improve natural hazard assessment efforts."

Credit: 
Columbia Climate School

Microscopy technique makes finer images of deeper tissue, more quickly

To create high-resolution, 3D images of tissues such as the brain, researchers often use two-photon microscopy, which involves aiming a high-intensity laser at the specimen to induce fluorescence excitation. However, scanning deep within the brain can be difficult because light scatters off of tissues as it goes deeper, making images blurry.

Two-photon imaging is also time-consuming, as it usually requires scanning individual pixels one at a time. A team of MIT and Harvard University researchers has now developed a modified version of two-photon imaging that can image deeper within tissue and perform the imaging much more quickly than what was previously possible.

This kind of imaging could allow scientists to more rapidly obtain high-resolution images of structures such as blood vessels and individual neurons within the brain, the researchers say.

"By modifying the laser beam coming into the tissue, we showed that we can go deeper and we can do finer imaging than the previous techniques," says Murat Yildirim, an MIT research scientist and one of the authors of the new study.

MIT graduate student Cheng Zheng and former postdoc Jong Kang Park are the lead authors of the paper, which appears today in Science Advances. Dushan N. Wadduwage, a former MIT postdoc who is now a John Harvard Distinguished Science Fellow in Imaging at the Center for Advanced Imaging at Harvard University, is the paper's senior author. Other authors include Josiah Boivin, an MIT postdoc; Yi Xue, a former MIT graduate student; Mriganka Sur, the Newton Professor of Neuroscience at MIT; and Peter So, an MIT professor of mechanical engineering and of biological engineering.

Deep imaging

Two-photon microscopy works by shining an intense beam of near-infrared light onto a single point within a sample, inducing simultaneous absorption of two photons at the focal point, where the intensity is the highest. This long-wavelength, low-energy light can penetrate deeper into tissue without damaging it, allowing for imaging below the surface.

However, two-photon excitation generates images by fluorescence, and the fluorescent signal is in the visible spectral region. When imaging deeper into tissue samples, the fluorescent light scatters more and the image becomes blurry. Imaging many layers of tissue is also very time-consuming. Using wide-field imaging, in which an entire plane of tissue is illuminated at once, can speed up the process, but the resolution of this approach is not as great as that of point-by-point scanning.

The MIT team wanted to develop a method that would allow them to image a large tissue sample all at once, while still maintaining the high resolution of point-by-point scanning. To achieve that, they came up with a way to manipulate the light that they shine onto the sample. They use a form of wide-field microscopy, shining a plane of light onto the tissue, but modify the amplitude of the light so that they can turn each pixel on or off at different times. Some pixels are lit up while nearby pixels remain dark, and this predesigned pattern can be detected in the light scattered by the tissue.

"We can turn each pixel on or off by this kind of modulation," Zheng says. "If we turn off some of the spots, that creates space around each pixel, so now we can know what is happening in each of the individual spots."

After the researchers obtain the raw images, they reconstruct each pixel using a computer algorithm that they created.

"We control the shape of the light and we get the response from the tissue. From these responses, we try to resolve what kind of scattering the tissue has. As we do the reconstructions from our raw images, we can get a lot of information that you cannot see in the raw images," Yildirim says.

Using this technique, the researchers showed that they could image about 200 microns deep into slices of muscle and kidney tissue, and about 300 microns into the brains of mice. That is about twice as deep as was possible without this patterned excitation and computational reconstruction, Yildirim says. The technique can also generate images about 100 to 1,000 times faster than conventional two-photon microscopy.

Brain structure

This type of imaging should allow researchers to more rapidly obtain high-resolution images of neurons in the brain, as well as other structures such as blood vessels. Imaging blood vessels in the brains of mice could be particularly useful for learning more about how blood flow is affected by neurodegenerative diseases such as Alzheimer's, Yildirim says.

"All the studies of blood flow or morphology of the blood vessel structures are based on two-photon or three-photon point scanning systems, so they're slow," he says. "By using this technology, we can really perform high-speed volumetric imaging of blood flow and blood vessel structure in order to understand the changes in blood flow."

The technique could also lend itself to measuring neuronal activity, by adding voltage-sensitive fluorescent dyes or fluorescent calcium probes that light up when neurons are excited. It could also be useful for analyzing other types of tissue, including tumors, where it could be used to help determine the edges of a tumor.

Credit: 
Massachusetts Institute of Technology

Researchers identify ultrastable single atom magnet

image: Dr. Aparajita Singha with one of the ESR-enabled STM systems at QNS

Image: 
QNS

Researchers at the IBS Center for Quantum Nanoscience at Ewha Womans University (QNS) have shown that dysprosium atoms resting on a thin insulating layer of magnesium oxide have magnetic stability over days. In a study published in Nature Communications, they demonstrate that these tiny magnets are extremely robust against fluctuations in magnetic field and temperature and will flip only when they are bombarded with high-energy electrons through the STM-tip.

Using these ultra-stable and yet switchable single-atom magnets, the team has shown atomic-scale control of the magnetic field within artificially built quantum architectures. "The atomic scale tunability and precision engineering of magnetic fields shown in this work adds a new paradigm for quantum logic devices and quantum computation," says Dr. Aparajita Singha, who conducted the research as a postdoc at QNS and is now a group leader at the Max Planck Institute for Solid State Research.

Although magnetism arises at the level of single atoms - from unpaired electron spins - small atomic clusters are generally magnetically very unstable without careful control of their surroundings. Understanding magnetic properties at such small scales is a fundamental physics problem, which has become technically very important for creating qubits - the building blocks for quantum computation.

Magnetism at such small scales can be studied and controlled using quantum tunneling through sharp electrode probes in a Scanning Tunneling Microscope (STM). The fingerprint of these atomic spins can be measured using single-atom electron spin resonance (ESR). The research team at QNS combined the use of these powerful techniques to find the right conditions for achieving the long-sought robust single-atom magnet.

"Creating the smallest ultra-stable magnets was far from a small effort. It needed operating at the limits of measurement techniques and finding just the right conditions. On a double-layer MgO substrate, the Dy atom is almost isolated but still feels enough directionality to maintain a defined polarity over days," according to Dr. Singha.

To freeze single atoms in place and measure their minuscule signals, the team created an extreme physical environment, including: (a) temperatures thousands of times lower than room temperature, at which atoms stop drifting on surfaces; (b) a vacuum emptier than outer space, so that the atoms are not contaminated by impurities that would otherwise bias the results; and (c) ultraclean crystalline surfaces with almost nothing on top other than the desired single atoms. As for the tool itself, the researchers picked up single Fe (iron) atoms one by one on the STM-tip - generally 30 to 50 atoms - until achieving a sufficient signal-to-noise ratio in ESR, even in the absence of any external magnetic field. Since the electronic states of an ultrastable Dy-atom magnet (its 4f orbitals) are too shielded for direct STM measurement, the researchers instead measured its magnetic field as projected onto a more easily measurable sensor Fe-atom placed at defined locations on the same surface. Using the same STM-tip, they also arranged single Dy-atom magnets at different lattice sites of the crystalline substrate around the sensor Fe-atom. Deliberately flipping the individual Dy-atom magnets changed the magnetic field at the sensor Fe-atom in precise, discrete steps, which ESR measurements showed to be stable over days.

Switchable ultrastable single-atom magnets placed at atomically precise locations provide a toolbox for extremely local but precise control of magnetic fields. Once the magnetic state is set, it is maintained automatically without any need of huge and expensive external magnets. Dr. Singha concluded that, "the atomic-scale tunability of magnetic field is a powerful control tool for future surface-based quantum circuits."

Credit: 
Institute for Basic Science

Beyond 5G: Wireless communications may get a boost from ultra-short collimating metalens

image: Ultra-short collimating metalens with a distance of only one millimeter.

Image: 
Takehito Suzuki/ TUAT

Screens may be larger on smartphones now, but nearly every other component is designed to be thinner, flatter and tinier than ever before. The engineering requires a shift from shapely, bulky lenses to miniaturized, two-dimensional metalenses. They might look better, but do they work better?

A team of Japan-based researchers says yes, thanks to a solution they published on July 7th in Applied Physics Express, a journal of the Japan Society of Applied Physics.

The researchers previously developed a low-reflection metasurface -- an ultra-thin interface that can manipulate electromagnetic waves -- specifically to control terahertz waves. These waves overlap millimeter waves and infrared waves, and, while they can transmit a significant amount of data, they easily attenuate in the atmosphere.

The technology may not be suitable for long-range wireless communications, but it could improve short-range data exchanges, such as residential internet speeds, said paper author Takehito Suzuki, associate professor in the Institute of Engineering at Tokyo University of Agriculture and Technology. According to Suzuki, the researchers have taken a step toward such applications by using their metasurface to craft what they describe as the world's leading ultra-short collimating metalens, which aligns the optical system across a distance of only one millimeter. The metalens increases the transmitted power threefold in the far field, where the signal strength typically weakens.

"Terahertz flat optics based on our originally developed low-reflection metasurface with a high-refractive index can offer attractive two-dimensional optical components for the manipulation of terahertz waves," Suzuki said.

The challenge was whether the collimating lens made with the metasurface, which converts approximately spherical terahertz waves into aligned terahertz waves, could be mounted close to the electronics -- a device called a resonant tunneling diode -- that transmits terahertz waves at the right frequency and in the right direction. Minimizing the distance between the diode and the metalens is the necessary ingredient in current and future electronic devices, Suzuki said.

"We resolved this problem," Suzuki said. "We integrated a fabricated collimating metalens made with our original metasurface with a resonant tunneling diode at a distance of one millimeter." Measurements verify that the collimating metalens integrated with the resonant tunneling diode enhances the directivity to three times that of a single resonant tunneling diode.

The researchers tuned their device to 0.3 terahertz, a band at a higher frequency than the one used for 5G wireless communications. The manipulation of higher-frequency electromagnetic waves allows the upload and download of huge amounts of data in 6G wireless communications, according to Suzuki.

"The 0.3 terahertz band is a promising candidate for 6G offering advanced cyber-physical systems," Suzuki said. "And our presented collimating metalens can be simply integrated with various terahertz continuous-wave sources to accelerate the growth of emerging terahertz industry such as 6G wireless communications."

Credit: 
Tokyo University of Agriculture and Technology

AI predicts diabetes risk by measuring fat around the heart

A team led by researchers from Queen Mary University of London has developed a new artificial intelligence (AI) tool that is able to automatically measure the amount of fat around the heart from MRI scan images.

Using the new tool, the team was able to show that a larger amount of fat around the heart is associated with significantly greater odds of diabetes, independent of a person's age, sex, and body mass index.

The research is published in the journal Frontiers in Cardiovascular Medicine and is the result of funding from the CAP-AI programme, which is led by Barts Life Sciences, a research and innovation partnership between Queen Mary University of London and Barts Health NHS Trust.

The distribution of fat in the body can influence a person's risk of developing various diseases. The commonly used measure of body mass index (BMI) mostly reflects fat accumulation under the skin, rather than around the internal organs. In particular, fat accumulation around the heart has been suggested as a predictor of heart disease and has been linked to a range of conditions, including atrial fibrillation, diabetes, and coronary artery disease.

Lead researcher Dr Zahra Raisi-Estabragh from Queen Mary University of London said: "Unfortunately, manual measurement of the amount of fat around the heart is challenging and time-consuming. For this reason, to date, no-one has been able to investigate this thoroughly in studies of large groups of people.

"To address this problem, we've invented an AI tool that can be applied to standard heart MRI scans to obtain a measure of the fat around the heart automatically and quickly, in under three seconds. This tool can be used by future researchers to discover more about the links between the fat around the heart and disease risk, but also potentially in the future, as part of a patient's standard care in hospital."

The research team tested the AI algorithm's ability to interpret images from heart MRI scans of more than 45,000 people, including participants in the UK Biobank, a database of health information from over half a million people across the UK. The team found that the AI tool could accurately determine the amount of fat around the heart in those images and could also use that measure to calculate a patient's risk of diabetes.

Dr Andrew Bard from Queen Mary University of London, who led the technical development, added: "The AI tool also includes an in-built method for calculating uncertainty of its own results, so you could say it has an impressive ability to mark its own homework."

Professor Steffen Petersen from Queen Mary University of London, who supervised the project, said: "This novel tool has high utility for future research and, if clinical utility is demonstrated, may be applied in clinical practice to improve patient care. This work highlights the value of cross-disciplinary collaborations in medical research, particularly within cardiovascular imaging."

Credit: 
Queen Mary University of London

The reproductive advantages of large male fish

image: A male mosquitofish (Gambusia holbrooki) attempting to mate with a female. Photo: Andrew Kahn

In mosquitofish of the genus Gambusia, male fish are smaller than females - sometimes only half the size. Biologists had previously assumed that smaller male mosquitofish had at least some reproductive advantages. Researchers from the transregional collaborative research centre NC³ at Bielefeld University have now shown in a systematic review and meta-analysis that larger male mosquitofish are actually more successful at reproduction: they can, for instance, better challenge their rivals; they produce more sperm; and they are preferred by female fish. The researchers are presenting their findings today (07.07.2021) in the Journal of Animal Ecology.

Mosquitofish are small fish with nondescript coloring belonging to the genus Gambusia, which contains some 45 species. While female mosquitofish can be up to 7 centimetres long, males are often just about 4 centimetres long. Their sizes do vary, however, and these fish are therefore often used to study sexual selection based on body size. "Even though there have been many studies on whether body size in males confers reproductive advantages, the findings have been mixed," says Alfredo Sánchez-Tójar, a biologist working in the transregional collaborative research centre NC³ (SFB-TRR 212) on a subproject that uses meta-analyses to investigate the behavioural ecology of individualised niches (subproject D05).

Studies have indeed shown that larger male fish are better at driving off rivals, and that females prefer larger males when it comes to mating. Male mosquitofish, however, usually circumvent female cooperation by forcing copulation. Smaller males, being more agile and better at lying in wait, were thought to be more successful at this coercive mating strategy.

For their systematic review and meta-analysis, the NC³ researchers evaluated 36 different studies that had investigated the correlation between body size and reproductive performance in male Gambusia fish. 'Our work demonstrates that larger male Gambusia fish actually have greater reproductive success than their smaller counterparts. This correlation is surprising - we had assumed that the advantages of small male fish would carry greater weight,' says Sánchez-Tójar.

'This meta-analysis brings together many years of research on this topic and enables biologists to study additional questions in this field in the future. Such systematic reviews and meta-analyses are becoming ever more important, in part because the scientific literature is constantly increasing,' says Professor Dr. Klaus Reinhold, a member of Bielefeld University's Faculty of Biology, head of the Evolutionary Biology research group, and head of NC³ subproject D05. Reinhold and Sánchez-Tójar conducted the systematic review and meta-analysis together with two other researchers: Dr. Nicholas Patrick Moran from the National Institute of Aquatic Resources (DTU Aqua) at the Technical University of Denmark, and Bora Kim from the Konrad Lorenz Institute of Ethology at the University of Veterinary Medicine in Vienna. Moran and Kim both previously participated in the collaborative research centre NC³.

In their meta-analysis, the researchers also included studies in which the correlation between body size and reproductive performance was not part of the original research question but where the relevant data had still been collected. Through pre-registration, the researchers documented their hypotheses and methods before performing the actual analysis. 'Such strategies are important for a review to be meaningful - and for the results to be as unbiased as possible,' says Sánchez-Tójar.

The studies under review had measured reproductive performance in many different ways: data had been collected, for example, on which males were preferred by female Gambusia, whether copulation was successful and whether it resulted in paternity, or on the quality and quantity of sperm. 'Our work shows that the positive correlation between reproductive performance and body size is robust in all of these areas,' says Sánchez-Tójar. The largest effect size is seen in mate selection by females: the larger the male fish, the more likely it is for female fish to mate with him. 'This is particularly interesting because the influence of female mate selection in mosquitofish had previously been neglected in the literature, as the focus was often placed on coercive mating,' explains Sánchez-Tójar.
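
The pooling step behind a meta-analysis like this one can be sketched in a few lines: each study's correlation between body size and reproductive performance is converted to Fisher's z scale, weighted by its precision, and combined under a random-effects model that allows for differences between studies. The example below is a generic illustration of that procedure with invented numbers; it is not the authors' analysis code.

```python
# Generic illustration (not the authors' code): pooling per-study correlations
# with a DerSimonian-Laird random-effects model on Fisher's z scale.
# The correlations and sample sizes below are invented.
import numpy as np

r = np.array([0.30, 0.15, 0.45, 0.25])   # per-study correlations (invented)
n = np.array([40, 60, 35, 80])           # per-study sample sizes (invented)

z = np.arctanh(r)        # Fisher's z transform of each correlation
v = 1.0 / (n - 3)        # approximate sampling variance of each z
w = 1.0 / v              # inverse-variance (fixed-effect) weights

# DerSimonian-Laird estimate of the between-study variance tau^2
z_fixed = np.sum(w * z) / np.sum(w)
Q = np.sum(w * (z - z_fixed) ** 2)
C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(r) - 1)) / C)

# Random-effects weights and pooled estimate, back-transformed to a correlation
w_re = 1.0 / (v + tau2)
z_pooled = np.sum(w_re * z) / np.sum(w_re)
print(f"Pooled correlation: {np.tanh(z_pooled):.2f}")
```

Real analyses of this kind also handle multiple effect sizes per study and test moderators, but the weighting logic is the same.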

The scientists hope their study will help shape future research. 'In some categories of reproductive performance, the findings are very heterogeneous, which also has to do with study design. In some cases, the experimental environment is not complex enough because, for example, vegetation is missing or temperatures and periods of light exposure are unrealistic,' says Kim, who is the lead author of the study. Moran adds: 'This systematic review and meta-analysis provides a solid foundation upon which to further refine research questions and methods.'

The aim of subproject D05 is to produce these types of meta-analyses on the topics under investigation in the NC³ transregional collaborative research centre. NC³ stands for 'Niche Choice, Niche Conformance, Niche Construction.' This collaborative research centre, with sites at Bielefeld University, the University of Münster, and the University of Jena, investigates ecological niches at the individual level, bringing together behavioural biology, ecology, and evolutionary biology with theoretical biology and philosophy. The German Research Foundation (DFG) has been funding NC³ since January 2018, initially for a period of four years, with a total of 8.5 million euros.

Credit: 
Bielefeld University