Tech

Exploration of ocean currents beneath the 'Doomsday Glacier'

image: Photo of the uncrewed submarine Ran (Image: Filip Stedt)

For the first time, researchers have been able to obtain data from underneath Thwaites Glacier, also known as the "Doomsday Glacier". They find that the supply of warm water to the glacier is larger than previously thought, triggering concerns of faster melting and accelerating ice flow.

With the help of the uncrewed submarine Ran, which made its way under the front of Thwaites Glacier, the researchers have made a number of new discoveries. Professor Karen Heywood of the University of East Anglia commented:

"This was Ran's first venture to polar regions and her exploration of the waters under the ice shelf was much more successful than we had dared to hope. We plan to build on these exciting findings with further missions under the ice next year."

The submersible has, among other things, measured the strength, temperature, salinity and oxygen content of the ocean currents that go under the glacier.

"Global sea level is affected by how much ice there is on land, and the biggest uncertainty in the forecasts is the future evolution of the West Antarctic Ice Sheet," says Anna Wåhlin, professor of oceanography at the University of Gothenburg and lead author of the new study, now published in Science Advances.

Impacts global sea level

The ice sheet in West Antarctica accounts for about ten percent of the current rate of sea level rise, but it also holds the greatest potential for increasing that rate, because the fastest changes worldwide are taking place in Thwaites Glacier. Due to its location and shape, Thwaites is particularly sensitive to the warm and salty ocean currents that are finding their way underneath it.

This process can lead to an accelerated melting taking place at the bottom of the glacier and inland movement of the so-called grounding zone, the area where the ice transitions from resting on the seabed to floating in the ocean.

Because the glacier lies in an inaccessible location, far from research stations, in an area usually blocked by thick sea ice and many icebergs, in situ measurements from the region have been in very short supply. This means there are big knowledge gaps for the ice-ocean boundary processes in this region.

First measurements performed

In the study, the researchers present the results from the submersible's survey of the strength, temperature, salinity and oxygen content of the ocean currents that flow under the glacier.

"These were the first measurements ever performed beneath Thwaites glacier", says Anna Wåhlin.

The results have been used to map the ocean currents underneath the floating part of the glacier. The researchers discovered that there is a deep connection to the east through which deep water flows from Pine Island Bay, a connection that was previously thought to be blocked by an underwater ridge.

The research group has also measured the heat transport in one of the three channels that lead warm water towards Thwaites Glacier from the north. "The channels for warm water to access and attack Thwaites weren't known to us before the research. Using sonars on the ship, nested with very high-resolution ocean mapping from Ran, we were able to find that there are distinct paths that water takes in and out of the ice shelf cavity, influenced by the geometry of the ocean floor" says Dr Alastair Graham, University of Southern Florida.

The value measured there, 0.8 TW, corresponds to a net melting of 75 km3 of ice per year, almost as much as the total basal melt of the entire ice shelf. Although the amount of ice melted by this warm water is small compared to other global freshwater sources, the heat transport has a large effect locally and may indicate that the glacier is not stable over time.
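
As a rough cross-check on that conversion, the sketch below (Python) turns a heat transport into an annual ice-melt volume using the latent heat of fusion of ice; the assumption that the ice must first warm by about 15 °C, and the constants used, are illustrative choices of ours, not values from the paper.

```python
# Back-of-the-envelope: ocean heat transport (TW) -> ice melt (km^3/yr).
SECONDS_PER_YEAR = 3.156e7
LATENT_HEAT_FUSION = 3.34e5  # J per kg of ice melted
CP_ICE = 2.05e3              # J/(kg K); specific heat of ice (assumed)
ICE_DENSITY = 917.0          # kg/m^3

def melt_volume_km3(heat_transport_tw, ice_warming_k=15.0):
    """Ice volume melted per year by a given heat transport.

    ice_warming_k is an assumed temperature rise needed to bring cold
    glacial ice up to its melting point before the latent heat applies.
    """
    joules_per_year = heat_transport_tw * 1e12 * SECONDS_PER_YEAR
    energy_per_kg = LATENT_HEAT_FUSION + CP_ICE * ice_warming_k
    mass_kg = joules_per_year / energy_per_kg
    return mass_kg / ICE_DENSITY / 1e9  # m^3 -> km^3

print(f"{melt_volume_km3(0.8):.0f} km^3 of ice per year")  # ~75 km^3
```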

Not sustainable over time

The researchers also noted that large amounts of meltwater flowed north away from the front of the glacier.

Variations in salinity, temperature and oxygen content indicate that the area under the glacier is a previously unknown active area where different water masses meet and mix with each other, which is important for understanding the melting processes at the base of the ice.

The observations show warm water approaching the glacier's pinning points from all sides. Pinning points are critical locations where the ice is connected to the seabed, giving stability to the ice shelf. Melting around these pinning points may lead to instability and retreat of the ice shelf and, subsequently, of the upstream glacier flowing off the land. Dr Rob Larter of the British Antarctic Survey commented:

"This work highlights that how and where warm water impacts Thwaites Glacier is influenced by the shape of the sea floor and the ice-shelf base as well as the properties of the water itself. The successful integration of new sea-floor survey data and observations of water properties from the Ran missions shows the benefits of the multidisciplinary ethos within the International Thwaites Glacier Collaboration."

"The good news is that we are now, for the first time, collecting data that is necessary to model the dynamics of Thwaite's glacier. This data will help us better calculate ice melting in the future. With the help of new technology, we can improve the models and reduce the great uncertainty that now prevails around global sea level variations." says Anna Wåhlin.

Credit: 
University of Gothenburg

Men with low health literacy less likely to choose active surveillance for prostate cancer after tumor profiling

image: Dr. Peter Gann, professor of pathology at UIC (Image: Joshua Clark/UIC)

Active surveillance leads to improved quality of life

Men with low health literacy seven times less likely to accept active surveillance

Prostate cancer and active surveillance patient education is needed

Tumor gene profiling is a tool that can help patients with a cancer diagnosis make informed decisions about treatment. In predominantly white populations, among men with early stage, favorable-risk prostate cancer, these tools have been shown to increase patient acceptance of active surveillance -- a common, evidence-based approach to monitor the tumor before a more aggressive treatment, like surgery or radiation.

However, a new study from researchers at the University of Illinois Chicago and Northwestern University shows that in a predominantly Black, urban patient population with substantial social disadvantage, tumor profiling had the opposite effect among men with clinically similar prostate cancers -- it decreased patient acceptance of active surveillance. In fact, men with low health literacy were more than seven times less likely to accept active surveillance if their tumors were profiled, compared with those with high health literacy.

"The data presented in this study provide important evidence that tumor profiling has a different impact in high-risk populations and in populations with less access to health services and education," said Dr. Peter Gann, professor of pathology at the UIC College of Medicine and corresponding author of the study.

The findings are published today in the Journal of Clinical Oncology.

"We generally consider acceptance of active surveillance to be a good thing, as it can lead to improved quality of life and a longer time without treatment side effects," said Dr. Adam Murphy, assistant professor of urology at Northwestern University Feinberg School of Medicine and first author of the study. "Knowing that low health literacy may discourage men from selecting active surveillance, efforts should be made to provide prostate cancer and active surveillance-focused education for men with low-risk prostate cancer, so that they can make informed treatment decisions."

"It will be years before we can evaluate if outcomes vary as a result of these decisions, but it is vital that we understand how diverse communities are affected by these test results so that we can support confident, informed decision making," said Gann, who is member of the University of Illinois Cancer Center of UIC.

The study was conducted as part of a clinical trial called ENACT, for Engaging Newly Diagnosed Men About Cancer Treatment Options. The trial is the first to use a randomized design to evaluate the impact of a genomic test on treatment choice.

In the study, the researchers enrolled 200 men from three public hospitals in Chicago whose clinical findings put them in the very low to low-intermediate prostate cancer risk category, meaning all participants were considered candidates for active surveillance. The participants were randomly assigned at diagnosis to receive standard counseling, or standard counseling plus a discussion of tumor gene profiling test results.

For the intervention group, the Oncotype DX Genomic Prostate Score, or GPS, was used. GPS analyzes tumor cells and measures the activity of certain genes, and then "scores" the aggressiveness of the cancer. The results are presented as probabilities of adverse outcomes.

"Because the GPS test has been validated in mostly White patient populations, we particularly wanted to know how the test would affect Black patients' decision-making process for selecting a course of action for a favorable-risk prostate cancer diagnosis," Gann said.

Of the participants, 70% were Black, 16% had a college degree, 46% were classified as having low health literacy, and 16% were uninsured. Health literacy was measured by an individual's ability to understand information about their health.

Overall, the vast majority (82%) of participants enrolled in the trial chose active surveillance, while the others chose immediate treatment with surgery or radiation. But acceptance of active surveillance was lower in the group that received GPS results (74%) compared with those who did not receive GPS results (88%). Participants with low health literacy who received GPS results were seven times less likely to choose active surveillance.
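
A note on the arithmetic behind phrases like "seven times less likely": findings of this kind are usually reported as odds ratios. The sketch below shows how an odds ratio near seven can arise from two acceptance rates; the rates used are hypothetical, since the release does not give the subgroup numbers.

```python
def odds(p):
    """Odds corresponding to a probability p."""
    return p / (1.0 - p)

def odds_ratio(p_reference, p_group):
    """How many times smaller the second group's odds are."""
    return odds(p_reference) / odds(p_group)

# Hypothetical acceptance rates, for illustration only: a reference
# group accepting at 88% versus a comparison group accepting at 50%.
print(round(odds_ratio(0.88, 0.50), 1))  # ~7.3
```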

In addition, Gann and Murphy found that men with a positive family history of prostate cancer were significantly more likely to choose surveillance. "This was surprising. It could be that these men are more familiar with the rising acceptability of a surveillance approach, as well as the risk of treatment-related morbidity," Gann said.

Insurance also is an important factor in enabling patients to select active surveillance, Murphy noted. "Insurance coverage will promote compliance with the serial visits for PSA tests, prostate exams and prostate biopsies that are a part of active surveillance monitoring," said Murphy, who is a member of the Robert H. Lurie Comprehensive Cancer Center of Northwestern University.

A follow-up study is planned that will look at whether tumor profiling with GPS and prostate MRI can improve the safety of active surveillance in high-risk men, thanks to renewed funding for the ENACT clinical trial.

Credit: 
University of Illinois Chicago

Glass injection molding

image: Injection-molded structures made from the newly developed Glassomer composite (Image: Neptun Lab/University of Freiburg)

Glass is ubiquitous, from high-tech products in the fields of optics, telecommunications, chemistry and medicine to everyday objects such as bottles and windows. However, shaping glass is mainly based on processes such as melting, grinding or etching. These processes are decades old, technologically demanding, energy-intensive and severely limited in terms of the shapes that can be realized. For the first time, a team led by Prof. Dr. Bastian E. Rapp from the Laboratory of Process Technology at the Department of Microsystems Engineering at the University of Freiburg, in collaboration with the Freiburg-based start-up Glassomer, has developed a process that makes it possible to form glass easily, quickly and in almost any shape using injection molding. The researchers presented their results in the journal Science.

"For decades, glass has often been the second choice when it comes to materials in manufacturing processes because its formation is too complicated, energy-intensive and unsuitable for producing high-resolution structures," explains Rapp. "Polymers, on the other hand, have allow all of this, but their physical, optical, chemical and thermal properties are inferior to glass. As a result, we have combined polymer and glass processing. Our process will allow us to quickly and cost-effectively replace both mass-produced products and complex polymer structures and components with glass."

Injection molding is the most important process in the plastics industry: it enables the fast, cost-effective, high-throughput production of components in almost any shape and size. Until now, transparent glass could not be molded in this process. With the newly developed Glassomer injection-molding technology, based on a special granulate designed in-house, it is now possible to mold glass in high throughput at just 130 °C. The injection-molded components are then converted into glass in a heat-treatment process: the result is pure quartz glass. This process requires less energy than conventional glass melting, making it more energy-efficient. The formed glass components have a high surface quality, so post-treatment steps such as polishing are not required.

The novel designs made possible by Glassomer's glass injection molding technology have a wide range of applications from data technology, optics and solar technology to a so-called lab-on-a-chip and medical technology. "We see great potential especially for small high-tech glass components with complicated geometries. In addition to transparency, the very low coefficient of expansion of quartz glass also makes the technology interesting. Sensors and optics work reliably at any temperature if the key components are made of glass," explains Dr. Frederik Kotz, group leader at the Laboratory of Process Technology and Chief Scientific Officer (CSO) at Glassomer. "We have also been able to show that micro-optical glass coatings can increase the efficiency of solar cells. This technology can now be used to produce cost-effective high-tech coatings with high thermal stability. There are a number of commercial opportunities for it."

The team led by Frederik Kotz and Markus Mader, a doctoral student at the Laboratory of Process Technology, solved long-standing problems in the injection molding of glass, such as porosity and particle abrasion. In addition, key process steps in the new method were designed to use water as the base material, making the technology more environmentally friendly and sustainable.

Credit: 
University of Freiburg

Bird blood is a heating system in winter

Researchers at Lund University in Sweden have discovered that bird blood produces more heat in winter, when it is colder, than in autumn. The study is published in The FASEB Journal.

The secret lies in the energy factories of cells, the mitochondria. Mammals have no mitochondria in their red blood cells, but birds do, and according to the research team from Lund and Glasgow this means that the blood can function as a central heating system when it is cold.

"In winter, the mitochondria seem to prioritize producing more heat instead of more energy. The blood becomes a type of radiator that they can turn up when it gets colder", says Andreas Nord, researcher in evolutionary ecology at Lund University who led the study.

Until now, the common perception has been that birds keep warm by shivering with their large pectoral muscles and fluffing up their feathers. Less is known about other heat-regulating processes inside birds.

To investigate the function of mitochondria, the researchers examined great tits, coal tits and blue tits on two different occasions: early autumn and late winter. The researchers took blood samples from the birds and isolated the red blood cells. By using a so-called cell respirometer, a highly sensitive instrument that can measure how much oxygen the mitochondria consume, the researchers were able to calculate how much of the oxygen consumption was spent on producing energy and how much was spent on creating heat. Finally, they also measured the amount of mitochondria in each blood sample.
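
One common way to make that partition in cell respirometry (not necessarily this team's exact protocol) is to measure total oxygen consumption, then block ATP synthesis and measure the residual "leak" respiration, which dissipates as heat; the difference approximates energy-linked respiration. A minimal sketch with made-up rates:

```python
def partition_respiration(total_rate, leak_rate):
    """Split an oxygen consumption rate into energy-producing and
    heat-producing (leak) fractions. Any shared unit works, e.g.
    pmol O2 per second per sample."""
    atp_linked = total_rate - leak_rate
    return {
        "energy_fraction": atp_linked / total_rate,
        "heat_fraction": leak_rate / total_rate,
    }

# Made-up example rates for an autumn versus a winter blood sample:
print(partition_respiration(total_rate=100.0, leak_rate=30.0))
print(partition_respiration(total_rate=120.0, leak_rate=55.0))
```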

The results show that the blood samples taken in winter contained more mitochondria and that the mitochondria worked harder. However, the extra work did not go into producing more energy, as the researchers had assumed it would, given that birds have a much higher metabolism in winter.

"We had no idea that the birds could regulate their blood as a heating system in this way, so we were surprised", says Andreas Nord.

The researchers will now investigate whether cold weather is the whole explanation for the birds' blood producing more heat in winter. Among other things, they will study whether the food that the birds eat in winter affects the mitochondria.

Credit: 
Lund University

Learning what makes the nucleus tick

image: The graphic shows an unusual nuclear event in a beryllium-6 atom, in which a pair of protons is released. Understanding the inner workings of the nucleus is key to research at FRIB. (Image: Facility for Rare Isotope Beams)

Michigan State University's Witold Nazarewicz has a simple way to describe the complex work he does at the Facility for Rare Isotope Beams (frib.msu.edu), or FRIB.

"I study theoretical nuclear physics," said Nazarewicz, John A. Hannah Distinguished Professor of Physics and chief scientist at FRIB. "Nuclear theorists want to know what makes the nucleus tick."

There is a nucleus in every atom. Atoms, in turn, make up matter -- the stuff we interact with every day. But the nucleus is still shrouded in mystery. One of FRIB's goals in creating rare isotopes, or different forms of elements, is to better understand what's going on inside the cores of atoms.

In a new paper for Physical Review Letters, published online April 7, Simin Wang, a former research associate at FRIB, and Nazarewicz show how FRIB can spot signatures of unusual nuclear events and use those as windows into the nucleus.

"There will be a program at FRIB devoted to such measurements," said Nazarewicz. "What we want to do is to understand the structure of the nucleus."

As any child can attest, one of the best ways to understand how something works is to take it apart. In making rare isotopes, FRIB will create exotic nuclei that naturally fall apart or decay.

While some FRIB staff finish construction of the physical facility -- which is scheduled to start scientific experiments in 2022 -- theorists including Wang and Nazarewicz are developing computer models that will help interpret the new science it churns out as well as make predictions about nuclear behavior.

Nuclei are themselves built from subatomic particles known as protons and neutrons. There are certain nuclei that decay by creating pairs of protons or neutrons within the nucleus and then spitting them out.

For instance, this is the case for an isotope known as beryllium-6, a beryllium atom with four protons and two neutrons in its nucleus. Inside beryllium-6, the protons can pair up, and when the nucleus decays by releasing one such pair, FRIB's detectors will be able to spot the ejected particles.

What Wang and Nazarewicz have done is build a computer model that lets them essentially reconstruct what those protons looked like inside the nucleus, based on what FRIB's detectors see.

"We're measuring those particles as probes, not because we are particularly interested in protons," Nazarewicz said. "Those protons are messengers, carrying information about the nucleus from which they were emitted."

The model also works similarly for rare nuclei that decay by emitting pairs of neutrons.

One of the biggest challenges of the work was developing a computer model that could trace these particles over a tremendous span of length scales.

Nuclei are measured in femtometers, mere quadrillionths of a meter. But FRIB's detectors are, roughly speaking, a meter apart. For perspective, there are far more femtometers between your two pupils than there are meters between Earth and the sun.
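
That comparison is easy to verify. Taking a typical interpupillary distance of about 6 cm (our assumption, not a figure from the article) and one astronomical unit for the Earth-sun distance:

```python
PUPIL_DISTANCE_M = 0.06   # typical interpupillary distance (assumed)
EARTH_SUN_M = 1.496e11    # one astronomical unit, in meters

femtometers_between_pupils = PUPIL_DISTANCE_M / 1e-15  # 6e13 fm
print(femtometers_between_pupils / EARTH_SUN_M)        # ~400x more
```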

Yet the Spartans' model had to be able to account for goings-on at both the femtometer scale and the much larger distances that particles must cover to reach the detector.

"You must be able to properly characterize the particles inside the nucleus and follow them as they decay from the nucleus and travel to the detectors," Nazarewicz said. "It's not trivial to do calculations across those scales."

Nazarewicz credits Wang with powering through that challenge and driving the project to a successful conclusion. And, although Wang concedes it was difficult, he hopes that people remember not how hard the work was, but how exciting it is.

"Most of my research career has been devoted to developing theoretical tools connecting nuclear structure and experimental observables, so I can't describe how excited I am that FRIB is nearing completion," Wang said.

"Because the observables calculated with our new tool can be directly compared with experimental measurements, we'll be able to make a lot of predictions and discover many new phenomena," Wang said. "It will be a great era."

Credit: 
Michigan State University Facility for Rare Isotope Beams

Sales of sugar-sweetened beverages decline after SA introduces Health Promotion Levy - study

Led by a South African team at the South African Medical Research Council Centre for Health Economics and Decision Science (PRICELESS-SA) in the School of Public Health at the University of the Witwatersrand, Johannesburg (Wits), and the University of the Western Cape, in partnership with the University of North Carolina, USA, the study was published on 8 April in The Lancet Planetary Health.

South Africa faces an increasing burden of non-communicable diseases (NCDs) such as diabetes, hypertension, cardiovascular disease and cancers - diseases that can be linked to increased consumption of sugar, particularly from beverages.

Many countries, including Mexico, have used policies such as taxation to successfully curb consumption of sugary beverages.

South Africa's 2018 Health Promotion Levy placed a tax on sugary beverages, with the tax amount related to the amount of sugar in the drink and the first teaspoon untaxed.
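
To make the design concrete: as introduced in 2018, the levy taxed only the sugar content above a threshold of about 4 g per 100 ml (roughly that first teaspoon), at a fixed rate of roughly 2.1 South African cents per gram. The sketch below encodes that structure; treat both constants as illustrative rather than authoritative.

```python
def hpl_cents(sugar_g_per_100ml, volume_ml,
              threshold_g_per_100ml=4.0, rate_cents_per_g=2.1):
    """Approximate Health Promotion Levy on one drink, in SA cents.

    Only sugar above the per-100ml threshold is taxed; the threshold
    and rate here reflect the levy's 2018 design (illustrative values).
    """
    taxable_g_per_100ml = max(0.0, sugar_g_per_100ml - threshold_g_per_100ml)
    return taxable_g_per_100ml * (volume_ml / 100.0) * rate_cents_per_g

# A 330 ml soft drink containing 10.6 g of sugar per 100 ml:
print(f"{hpl_cents(10.6, 330):.1f} cents")  # ~45.7 cents
```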

Less sugar, calories and SSB purchases

In the study, titled "Changes in beverage purchases following the announcement and implementation of South Africa's Health Promotion Levy: an observational study", researchers examined the nutritional data of over 3,000 households' purchases before and after the tax to assess any changes in daily sugar, calories, and volume of taxed and non-taxed beverages.

Mr Nicholas Stacey, first author and Senior Researcher at PRICELESS-SA, and the team found a 51% reduction in sugar, a 52% reduction in calories, and a 29% reduction in the volume of beverages purchased per person per day following implementation of the tax.

"We also found that the relative reduction in the sugar content of taxable beverages was larger than that for volume, showing that industry reformulated products," says Stacey.

Changing consumer behavior beneficial

The researchers also analysed differences in purchasing behavior by household socioeconomic status. They found that households with lower socioeconomic status had purchased more taxable beverages prior to the announcement of the tax than higher socioeconomic status households did, but experienced larger reductions after the announcement and implementation of the tax.

"These results back up the impact we've seen from similar policies in other countries - that beverage taxes based on sugar content can help reduce excessive sugar and energy intake," says Professor Karen Hofman, Director of PRICELESS-SA. "Importantly, this shows that the lower income households that experience the greater burden of obesity, diabetes, hypertension, and other nutrition-related non-communicable diseases, benefit greatly from this Health Promotion Levy."

Credit: 
University of the Witwatersrand

Computer model fosters potential improvements to 'bionic eye' technology

image: The researchers' experimentally validated computer model reproduces the shapes and positions of millions of nerve cells in the eye, along with their physical and networking properties (Image: Gianluca Lazzi)

Millions of people face the loss of their eyesight from degenerative eye diseases. The genetic disorder retinitis pigmentosa alone affects 1 in 4,000 people worldwide.

Today, there is technology available to offer partial eyesight to people with that condition. The Argus II, the world's first retinal prosthesis, reproduces some functions of a part of the eye essential to vision, to allow users to perceive movement and shapes.

While the field of retinal prostheses is still in its infancy, for hundreds of users around the globe, the "bionic eye" enriches the way they interact with the world on a daily basis. For instance, seeing outlines of objects enables them to move around unfamiliar environments with increased safety.

That is just the start. Researchers are seeking future improvements upon the technology, with an ambitious objective in mind.

"Our goal now is to develop systems that truly mimic the complexity of the retina," said Gianluca Lazzi, a Provost Professor of Ophthalmology and Electrical Engineering at the Keck School of Medicine of USC and the USC Viterbi School of Engineering.

He and his USC colleagues cultivated progress with a pair of recent studies using an advanced computer model of what happens in the retina. Their experimentally validated model reproduces the shapes and positions of millions of nerve cells in the eye, as well as the physical and networking properties associated with them.

"Things that we couldn't even see before, we can now model," said Lazzi, who is also the Fred H. Cole Professor in Engineering and director of the USC Institute for Technology and Medical Systems. "We can mimic the behavior of the neural systems, so we can truly understand why the neural system does what it does."

Focusing on models of nerve cells that transmit visual information from the eye to the brain, the researchers identified ways to potentially increase clarity and grant color vision to future retinal prosthetic devices.

The eye, bionic and otherwise

To understand how the computer model could improve the bionic eye, it helps to know a little about how vision happens and how the prosthesis works.

When light enters the healthy eye, the lens focuses it onto the retina, at the back of the eye. Cells called photoreceptors translate the light into electrical impulses that are processed by other cells in the retina. After processing, the signals are passed along to ganglion cells, which deliver information from retina to brain through long tails, called axons, that are bundled together to make up the optic nerve.

Photoreceptors and processing cells die off in degenerative eye diseases. Retinal ganglion cells typically remain functional longer; the Argus II delivers signals directly to those cells.

"In these unfortunate conditions, there is no longer a good set of inputs to the ganglion cell," Lazzi said. "As engineers, we ask how we can provide that electrical input."

A patient receives a tiny eye implant with an array of electrodes. Those electrodes are remotely activated when a signal is transmitted from a pair of special glasses that have a camera on them. The patterns of light detected by the camera determine which retinal ganglion cells are activated by the electrodes, sending a signal to the brain that results in the perception of a black-and-white image comprising 60 dots.
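
A toy sketch of that camera-to-electrode pipeline, for intuition only (the Argus II's actual signal processing is not described in this article): downsample a grayscale frame to the implant's 6 x 10 electrode grid and activate an electrode wherever local brightness crosses a threshold.

```python
import numpy as np

GRID_ROWS, GRID_COLS = 6, 10  # the Argus II array has 60 electrodes

def frame_to_electrodes(frame, threshold=0.5):
    """Map a 2D grayscale frame (values in [0, 1]) to a 6x10 boolean
    electrode-activation pattern by block-averaging, then thresholding."""
    h, w = frame.shape
    bh, bw = h // GRID_ROWS, w // GRID_COLS
    blocks = frame[:bh * GRID_ROWS, :bw * GRID_COLS]
    blocks = blocks.reshape(GRID_ROWS, bh, GRID_COLS, bw).mean(axis=(1, 3))
    return blocks > threshold

# Simulated 120x200 camera frame with one bright region:
frame = np.zeros((120, 200))
frame[30:90, 20:80] = 1.0
print(frame_to_electrodes(frame).astype(int))
```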

Computer model courts new advances

Under certain conditions, an electrode in the implant will incidentally stimulate the axons of cells neighboring its target. For the user of the bionic eye, this off-target stimulation of axons results in the perception of an elongated shape instead of a dot. In a study published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, Lazzi and his colleagues deployed the computer model to address this issue.

"You want to activate this cell, but not the neighboring axon," Lazzi said. "So we tried to design an electrical stimulation waveform that more precisely targets the cell."

The researchers used models for two subtypes of retinal ganglion cells, at the single-cell level as well as in huge networks. They identified a pattern of short pulses that preferentially targets cell bodies, with less off-target activation of axons.

Another recent study in the journal Scientific Reports applied the same computer modeling system to the same two cell subtypes to investigate how to encode color.

This research builds upon earlier investigations showing that people using the Argus II perceive variations in color with changes in the frequency of the electrical signal -- the number of times the signal repeats over a given duration. Using the model, Lazzi and his colleagues developed a strategy for adjusting the signal's frequency to create the perception of the color blue.
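
In signal terms, frequency encoding means delivering the same charge-balanced pulse shape at a different repetition rate. The sketch below generates such a pulse train; all amplitudes, widths and rates are placeholder values, not the stimulation parameters used in the study.

```python
import numpy as np

def pulse_train(frequency_hz, duration_s, amplitude_ua=50.0,
                phase_width_s=0.45e-3, sample_rate_hz=50_000):
    """Charge-balanced biphasic pulse train at a given repetition rate.

    Returns (t, i): a time axis and the stimulation current in
    microamps. All parameter values are illustrative placeholders.
    """
    t = np.arange(0, duration_s, 1.0 / sample_rate_hz)
    i = np.zeros_like(t)
    phase_n = int(phase_width_s * sample_rate_hz)
    period_n = int(sample_rate_hz / frequency_hz)
    for start in range(0, len(t) - 2 * phase_n, period_n):
        i[start:start + phase_n] = -amplitude_ua               # cathodic phase
        i[start + phase_n:start + 2 * phase_n] = amplitude_ua  # anodic phase
    return t, i

# Same waveform at two repetition rates -- only the frequency differs:
t1, i_low = pulse_train(frequency_hz=10, duration_s=0.5)
t2, i_high = pulse_train(frequency_hz=120, duration_s=0.5)
print(int(np.abs(i_low).sum()), int(np.abs(i_high).sum()))  # more pulses at 120 Hz
```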

Beyond the possibility of adding color vision to the bionic eye, encoding with hues could be combined with artificial intelligence in future advances based on the system, so that particularly important elements in a person's surroundings, such as faces or doorways, stand out.

"There's a long road, but we're walking in the right direction," Lazzi said. "We can gift these prosthetics with intelligence, and with knowledge comes power."

Credit: 
Keck School of Medicine of USC

Study snapshot: How do weighted funding formulas affect charter school enrollments?

Study: "How Do Weighted Funding Formulas Affect Charter School Enrollments?"
Author: Paul Bruno (University of Illinois at Urbana-Champaign)

This study was presented today at the American Educational Research Association's 2021 Virtual Annual Meeting.

Main Findings:

The adoption of a school funding system in California that increased revenues for schools enrolling higher-need students led to an increase in the rate at which charter schools enrolled low-income students.

This effect was concentrated among charter schools initially enrolling low-income students at relatively low rates, suggesting that some charters "cream skim" high achieving, wealthier students, but that such behavior also can be mitigated.

Details:

For many, the expansion of charter schooling since the early 1990s has been a cause of concern. Among the major concerns is that charter schools will "cream skim" high-achieving, wealthier students from nearby traditional public schools, exacerbating segregation and burdening traditional schools with a combination of falling revenues and higher per-pupil costs.

Most states now adjust school funding to account for the costs of additional educational needs that certain groups of students are thought to have. These weighted student funding systems (WSF) differ in terms of which student characteristics are weighted, but additional funding weights are commonly given to students who require special education services or are English language learners or low-income.

In the study, the author analyzed the effects of a WSF policy implemented in 2013 in California that plausibly changed the incentives for charter schools to enroll disadvantaged students without a similar change in the incentives for students or their families to enroll in charter schools. The author looked at all charter schools in the state, without distinguishing nonprofits from for-profits.

With the adoption of the Local Control Funding Formula in 2013, weighted funding for low-income students rose significantly, increasing the per-pupil funding provided to schools for eligible students by 300 percent or more.

The author examined changes in the gap between charter schools and traditional public schools in the share of students eligible for free and reduced-price lunch (FRL) programs, from 2012 to 2017. He found that charter schools with relatively low FRL rates compared to their local district in 2012 gradually increased their FRL enrollments relative to traditional schools in subsequent years.

In 2012, these charter schools had almost 6 percentage points fewer FRL-eligible students than analogous traditional schools (i.e., those with FRL rates below their district average). Despite these traditional schools also gradually increasing their FRL shares during this time, the gap between charter schools and traditional schools shrank in every subsequent year, and by 2017 the gap was statistically indistinguishable from zero.

During the same period, the Local Control Funding Formula did not have the same effect on charter schools that already had relatively high FRL shares. The FRL gap between these charter schools and traditional schools that also started with relatively high FRL shares in 2017 (6.7 percentage points) was only modestly smaller than it was in 2012 (9.7 percentage points) and was slightly larger than the gap in 2013 (5.8 percentage points).

"My results suggest that previous studies on charter school cream skimming may have been too optimistic," said author Paul Bruno, an assistant professor of education policy, organization, and leadership at the University of Illinois at Urbana-Champaign. "If these state funding changes altered enrollment incentives only or mostly for charter schools, and not for families or traditional schools, then my results indicate that many charter schools are avoiding enrolling low-income students."

"The primary implication for policymakers is that charter schools appear to be sensitive to the costs of providing education," said Bruno. "This matters for both the funding and the regulation of charter schools."

The author noted that when designing weighted funding systems, policymakers need to think carefully about which student characteristics should be considered.

"There are some obvious candidates, including eligibility for free lunch or special education or English learner services," said Bruno. "Not only do students with these characteristics appear to have distinctive and costly educational needs, but there is also evidence that they are underserved by charter schools in at least some cases."

The author also noted that policymakers need to ensure that formula weights are large enough to change the behaviors of charter school operators, but also are not so large that they create perverse incentives, such as discouraging schools from declassifying students as English learners.

Credit: 
American Educational Research Association

New research reveals secret to Jupiter's curious aurora activity

Auroral displays continue to intrigue scientists, whether the bright lights shine over Earth or over another planet. The lights hold clues to the makeup of a planet's magnetic field and how that field operates.

New research about Jupiter proves that point -- and adds to the intrigue.

Peter Delamere, a professor of space physics at the University of Alaska Fairbanks Geophysical Institute, is among an international team of 13 researchers who have made a key discovery related to the aurora of our solar system's largest planet.

The team's work was published April 9, 2021, in the journal Science Advances. The research paper, titled "How Jupiter's unusual magnetospheric topology structures its aurora," was written by Binzheng Zhang of the Department of Earth Sciences at the University of Hong Kong; Delamere is the primary co-author.

Research done with a newly developed global magnetohydrodynamic model of Jupiter's magnetosphere provides evidence in support of a previously controversial and criticized idea that Delamere and researcher Fran Bagenal of the University of Colorado at Boulder put forward in a 2010 paper -- that Jupiter's polar cap is threaded in part with closed magnetic field lines rather than entirely with open magnetic field lines, as is the case with most other planets in our solar system.

"We as a community tend to polarize -- either open or closed -- and couldn't imagine a solution where it was a little of both," said Delamere, who has been studying Jupiter since 2000. "Yet in hindsight, that is exactly what the aurora was revealing to us."

Open lines are those that emanate from a planet but trail off into space away from the sun instead of reconnecting with a corresponding location in the opposite hemisphere.

On Earth, for example, the aurora appears on closed field lines around an area referred to as the auroral oval. It's the high-latitude ring near -- but not at -- each end of Earth's magnetic axis.

Within that ring on Earth, however, and as with some other planets in our solar system, is an empty spot referred to as the polar cap. It's a place where magnetic field lines stream out unconnected -- and where the aurorae rarely appear as a result. Think of it like an incomplete electrical circuit in your home: No complete circuit, no lights.

Jupiter, however, has a polar cap in which the aurora dazzles. That puzzled scientists.

The problem, Delamere said, is that researchers were so Earth-centric in their thinking about Jupiter because of what they had learned about Earth's own magnetic fields.

The arrival at Jupiter of NASA's Juno spacecraft in July 2016 provided images of the polar cap and aurora. But those images, along with some captured by the Hubble Space Telescope, couldn't resolve the disagreement among scientists about open lines versus closed lines.

So Delamere and the rest of the research team used computer modeling for help. Their research revealed a largely closed polar region with a small crescent-shaped area of open flux, accounting for only about 9 percent of the polar cap region. The rest was active with aurora, signifying closed magnetic field lines.

Jupiter, it turns out, possesses a mix of open and closed lines in its polar caps.

"There was no model or no understanding to explain how you could have a crescent of open flux like this simulation is producing," he said. "It just never even entered my mind. I don't think anybody in the community could have imagined this solution. Yet this simulation has produced it."

"To me, this is a major paradigm shift for the way that we understand magnetospheres."

What else does this reveal? More work for researchers.

"It raises many questions about how the solar wind interacts with Jupiter's magnetosphere and influences the dynamics," Delamere said.

Jupiter's aurorally active polar cap could, for example, be due to the rapidity of the planet's rotation -- once every 10 hours compared to Earth's once every 24 hours -- and the enormity of its magnetosphere. Both reduce the impact of the solar wind, meaning the polar cap magnetic field lines are less likely to be torn apart to become open lines.

And to what extent does Jupiter's moon Io affect the magnetic lines within Jupiter's polar cap? Io is electrodynamically linked to Jupiter, something unique in our solar system, and as such is constantly stripped of heavy ions by its parent planet.

As the paper notes, "The jury is still out on the magnetic structure of Jupiter's magnetosphere and what exactly its aurora is telling us about its topology."

Credit: 
University of Alaska Fairbanks

Discovery could help lengthen lifespan of electronic devices

image: Electron microscopy images show the degradation in action (Image: University of Sydney)

Ferroelectric materials are used in many devices, including memories, capacitors, actuators and sensors. These devices are commonly used in both consumer and industrial instruments, such as computers, medical ultrasound equipment and underwater sonars.

Over time, ferroelectric materials are subjected to repeated mechanical and electrical loading, leading to a progressive decrease in their functionality, ultimately resulting in failure. This process is referred to as 'ferroelectric fatigue'.

It is a main cause of the failure of a range of electronic devices, with discarded electronics a leading contributor to e-waste. Globally, tens of millions of tonnes of failed electronic devices go to landfill every year.

Using advanced in-situ electron microscopy, researchers from the University of Sydney's School of Aerospace, Mechanical and Mechatronic Engineering were able to observe ferroelectric fatigue as it occurred. The technique uses an advanced microscope to 'see', in real time, down to the nanoscale and atomic levels.

The researchers hope this new observation, described in a paper published in Nature Communications, will help better inform the future design of ferroelectric nanodevices.

"Our discovery is a significant scientific breakthrough as it shows a clear picture of how the ferroelectric degradation process is present at the nanoscale," said co-author Professor Xiaozhou Liao, also from the University of Sydney Nano Institute.

Dr Qianwei Huang, the study's lead researcher, said: "Although it has long been known that ferroelectric fatigue can shorten the lifespan of electronic devices, how it occurs has previously not been well understood, due to a lack of suitable technology to observe it."

Co-author Dr Zibin Chen said: "With this, we hope to better inform the engineering of devices with longer lifespans."

Observational findings spark new debate

Nobel laureate Herbert Kroemer once famously asserted "The interface is the device". The observations by the Sydney researchers could therefore spark a new debate on whether interfaces - which are physical boundaries separating different regions in materials - are a viable solution to the unreliability of next-generation devices.

"Our discovery has indicated that interfaces could actually speed up ferroelectric degradation. Therefore, better understanding of these processes is needed to achieve the best performance of devices," Dr Chen said.

Credit: 
University of Sydney

A breakthrough that enables practical semiconductor spintronics

image: Weimin Chen, professor at Linköping University (Image: Peter Modin/LiU)

In the future, it may be possible to build information technology in which electron spin is used to store, process and transfer information in quantum computers. It has long been the goal of scientists to be able to use spin-based quantum information technology at room temperature. A team of researchers from Sweden, Finland and Japan has now constructed a semiconductor component in which information can be efficiently exchanged between electron spin and light at room temperature and above. The new method is described in an article published in Nature Photonics.

It is well known that electrons have a negative charge, and they also have another property, namely spin. The latter may prove instrumental in the advance of information technology. To put it simply, we can imagine the electron rotating around its own axis, similar to the way in which the Earth rotates around its own axis. Spintronics - a promising candidate for future information technology - uses this quantum property of electrons to store, process and transfer information. This brings important benefits, such as higher speed and lower energy consumption than traditional electronics.

Developments in spintronics in recent decades have been based on the use of metals, and these have been highly significant for the possibility of storing large amounts of data. There would, however, be several advantages in using spintronics based on semiconductors, in the same way that semiconductors form the backbone of today's electronics and photonics.

"One important advantage of spintronics based on semiconductors is the possibility to convert the information that is represented by the spin state and transfer it to light, and vice versa. The technology is known as opto-spintronics. It would make it possible to integrate information processing and storage based on spin with information transfer through light", says Weimin Chen, professor at Linköping University, Sweden, who led the project.

As the electronics used today operates at room temperature and above, a serious problem in the development of spintronics has been that electrons tend to switch and randomise their direction of spin when the temperature rises. This means that the information coded in the electron spin states is lost or becomes ambiguous. A necessary condition for the development of semiconductor-based spintronics is therefore the ability to orient essentially all electrons into the same spin state and maintain it, in other words to keep them spin polarised, at room temperature and higher temperatures. Previous research achieved an electron spin polarisation of, at best, around 60% at room temperature, too low for large-scale practical applications.

Researchers at Linköping University, Tampere University and Hokkaido University have now achieved an electron spin polarisation at room temperature greater than 90%, and the polarisation remains high even up to 110 °C. This technological advance, which is described in Nature Photonics, is based on an opto-spintronic nanostructure that the researchers constructed from layers of different semiconductor materials. It contains nanoscale regions called quantum dots, each around 10,000 times smaller than the thickness of a human hair.

When a spin polarised electron impinges on a quantum dot, it emits light - to be more precise, a single photon with a state (angular momentum) determined by the electron spin. Quantum dots are therefore considered to have great potential as an interface to transfer information between electron spin and light, as will be necessary in spintronics, photonics and quantum computing. In the newly published study, the scientists show that it is possible to use an adjacent spin filter to control the electron spin of the quantum dots remotely, and at room temperature.

The quantum dots are made from indium arsenide (InAs), and a layer of gallium nitrogen arsenide (GaNAs) functions as a spin filter. A layer of gallium arsenide (GaAs) is sandwiched between them. Similar structures are already being used in optoelectronic technology based on gallium arsenide, and the researchers believe that this can make it easier to integrate spintronics with existing electronic and photonic components.

"We are very happy that our long-term efforts to increase the expertise required to fabricate highly-controlled N-containing semiconductors is defining a new frontier in spintronics. So far, we have had a good level of success when using such materials for optoelectronics devices, most recently in high-efficiency solar-cells and laser diodes. Now we are looking forward to continuing this work and to unite photonics and spintronics, using a common platform for light-based and spin-based quantum technology", says Professor Mircea Guina, head of the research team at Tampere University in Finland.

Credit: 
Linköping University

UofL biologists create better method to culture cells for testing drug toxicity

image: AJP Cell Physiology April issue cover featuring work of UofL biologists (Image: American Journal of Physiology-Cell Physiology)

LOUISVILLE, Ky. - When a new drug is being developed, the first question is, "Does it work?" The second question is, "Does it do harm?" No matter how effective a therapy is, if it harms the patient in the process, it has little value.

Doctoral student Robert Skolik and Associate Professor Michael Menze, Ph.D., in the Department of Biology at the University of Louisville, have found a way to make cell cultures respond more closely to normal cells, allowing drugs to be screened for toxicity earlier in the research timeline.

The vast majority of cells used for biomedical research are derived from cancer tissues stored in biorepositories. They are cheap to maintain, easy to grow and multiply quickly. Specifically, liver cancer cells are desirable for testing the toxicity of drugs for any number of diseases.

"You like to use liver cells because this is the organ that would detoxify whatever drug for whatever treatment you are testing," Menze said. "When new drugs are being developed for diabetes or another disease, one of the concerns is whether they are toxic to the liver."

The cells do come with limitations, however. Since they are cancer cells, they may not be as sensitive to toxins as normal cells, so they may not reveal issues with toxicity that can appear much later in the drug testing process.

Skolik and Menze have discovered that by changing two components of the media used to culture the cells, they can make liver cancer cells behave more like normal liver cells. Rather than using standard serum containing glucose, they used serum from which the glucose had been removed by dialysis, and added galactose - a different form of sugar - to the media. The tumor cells metabolize galactose at a much slower rate than glucose. This changes the metabolism of the cells, making them behave more like normal liver cells.

By using cells cultured with this modified serum, drugs may effectively be screened for toxicity earlier in the research process, possibly saving millions of dollars.

"It started just as a way to sensitize cells to mitochondrial activity, the cellular powerhouse, but then we realized we had a way to investigate how we are shifting cancer metabolism," Skolik said. "In short, we have found a way to reprogram cancer cells to look - and act - more like a normal cell."

The research is featured on the cover of the April issue of American Journal of Physiology-Cell Physiology. The cover image was the work of Nilay Chakraborty, Ph.D., and Jason Solocinski at the University of Michigan-Dearborn, who developed a new process to obtain live images of the distribution of energy molecules in cells, showing how cells respond to changes in the cell culture conditions.

To fully realize the effect he reported, Skolik also cultured the cells for a longer period of time than usual.

"In the past, people would do a 12-hour adaptation to this new media. But what we showed is if you culture them for 4 to 5 weeks, you have a much more robust shift," Skolik said.

"When it comes to gene expression, you get much more bang for the buck when you adapt them for a longer period."

Although the modified serum for the cultures requires the additional step of dialysis and longer culture time, it can yield benefits at later testing stages.

"You would reserve this process for key experiments or toxicity screening," Menze said. "However, if you go into a Phase 1 clinical trial and find toxicity there, it is way more expensive than using this method."

Credit: 
University of Louisville

Chronic sinus inflammation appears to alter brain activity

The millions of people who have chronic sinusitis deal not only with stuffy noses and headaches; they also commonly struggle to focus, and experience depression and other symptoms that implicate the brain's involvement in their illness.

New research links sinus inflammation with alterations in brain activity, specifically with the neural networks that modulate cognition, introspection and response to external stimuli.

The paper was published today in JAMA Otolaryngology-Head & Neck Surgery.

"This is the first study that links chronic sinus inflammation with a neurobiological change," said lead author Dr. Aria Jafari, a surgeon and assistant professor of Otolaryngology-Head & Neck Surgery at the University of Washington School of Medicine.

"We know from previous studies that patients who have sinusitis often decide to seek medical care not because they have a runny nose and sinus pressure, but because the disease is affecting how they interact with the world: They can't be productive, thinking is difficult, sleep is lousy. It broadly impacts their quality of life. Now we have a prospective mechanism for what we observe clinically."

Chronic rhinosinusitis affects about 11% of U.S. adults, according to the Centers for Disease Control and Prevention. The condition can necessitate treatment over a span of years, typically involving antibiotics. Repeated cycles of inflammation and repair thicken sinus tissues, much like calloused skin. Surgery may resolve the issue, but symptoms also can recur.

The researchers identified a study cohort from the Human Connectome Project, an open-access, brain-focused dataset of 1,206 healthy adults ages 22-35. Data included radiology image scans and cognitive/behavioral measurements.

The scans enabled them to identify 22 people with moderate or severe sinus inflammation as well as an age- and gender-matched control group of 22 with no sinus inflammation. Functional MRI (fMRI) scans, which detect cerebral blood flow and neuronal activity, showed these distinguishing features in the study subjects:

decreased functional connectivity in the frontoparietal network, a regional hub for executive function, maintaining attention and problem-solving;

increased functional connectivity to two nodes in the default-mode network, which influences self-reference and is active during wakeful rest and mind-wandering;

decreased functional connectivity in the salience network, which is involved in detecting and integrating external stimuli, communication and social behavior.

The magnitude of brain-activity differences seen in the study group paralleled the severity of sinus inflammation among the subjects, Jafari said.
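
"Functional connectivity" in analyses like these is typically quantified as the correlation between the fMRI signal time series of different brain regions: the more two regions rise and fall together, the stronger their connectivity. A minimal sketch of that computation (illustrative; not the study's actual pipeline):

```python
import numpy as np

def functional_connectivity(timeseries):
    """Pearson correlation matrix between brain regions.

    timeseries: array of shape (n_regions, n_timepoints), one
    region-averaged fMRI (BOLD) signal per row.
    """
    return np.corrcoef(timeseries)

# Simulated signals for three regions: regions 0 and 1 share a common
# driver, so their connectivity is high; region 2 is independent.
rng = np.random.default_rng(0)
driver = rng.standard_normal(200)
regions = np.vstack([
    driver + 0.3 * rng.standard_normal(200),
    driver + 0.3 * rng.standard_normal(200),
    rng.standard_normal(200),
])
print(functional_connectivity(regions).round(2))
```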

Despite the brain-activity changes, however, no significant deficit was seen in the behavioral and cognitive testing of study-group participants, said Dr. Kristina Simonyan, a study co-author. She is an associate professor of otolaryngology-head & neck surgery at Harvard Medical School and director of laryngology research at Massachusetts Eye and Ear.

"The participants with moderate and severe sinus inflammation were young individuals who did not show clinically significant signs of cognitive impairment. However, their brain scans told us a different story: The subjective feelings of attention decline, difficulties to focus or sleep disturbances that a person with sinus inflammation experiences might be associated with subtle changes in how brain regions controlling these functions communicate with one another," said Simonyan.

It is plausible, she added, that these changes may cause more clinically meaningful symptoms if chronic sinusitis is left untreated. "It is also possible that we might have detected the early markers of a cognitive decline where sinus inflammation acts as a predisposing trigger or predictive factor," Simonyan said.

Jafari sees the study findings as a launch pad to explore new therapies for the disease.

"The next step would be to study people who have been clinically diagnosed with chronic sinusitis. It might involve scanning patients' brains, then providing typical treatment for sinus disease with medication or surgery, and then scanning again afterward to see if their brain activity had changed. Or we could look for inflammatory molecules or markers in patients' bloodstreams."

In the bigger picture, he said, the study may help ear-nose-throat specialists be mindful of the less-evident distress that many patients experience with chronic sinusitis.

"Our care should not be limited to relieving the most overt physical symptoms, but the whole burden of patients' disease."

Credit: 
University of Washington School of Medicine/UW Medicine

The spintronics technology revolution could be just a hopfion away

image: Artist's drawing of the characteristic 3D spin texture of a magnetic hopfion. Berkeley Lab scientists have created and observed 3D hopfions, a discovery that could advance spintronics memory devices. (Image: Peter Fischer and Frances Hellman/Berkeley Lab)

A decade ago, the discovery of quasiparticles called magnetic skyrmions provided important new clues into how microscopic spin textures will enable spintronics, a new class of electronics that use the orientation of an electron's spin rather than its charge to encode data.

But although scientists have made big advances in this very young field, they still don't fully understand how to design spintronics materials that would allow for ultrasmall, ultrafast, low-power devices. Skyrmions may seem promising, but scientists have long treated them as merely 2D objects. Recent studies, however, have suggested that 2D skyrmions could actually be the genesis of a 3D spin pattern called a hopfion. But no one had been able to experimentally prove that magnetic hopfions exist on the nanoscale.

Now, a team of researchers co-led by Berkeley Lab has reported in Nature Communications the first demonstration and observation of 3D hopfions emerging from skyrmions at the nanoscale (billionths of a meter) in a magnetic system. The researchers say that their discovery heralds a major step forward in realizing high-density, high-speed, low-power, yet ultrastable magnetic memory devices that exploit the intrinsic power of electron spin.

"We not only proved that complex spin textures like 3D hopfions exist - We also demonstrated how to study and therefore harness them," said co-senior author Peter Fischer, a senior scientist in Berkeley Lab's Materials Sciences Division who is also an adjunct professor in physics at UC Santa Cruz. "To understand how hopfions really work, we have to know how to make them and study them. This work was possible only because we have these amazing tools at Berkeley Lab and our collaborative partnerships with scientists around the world," he said.

According to previous studies, hopfions, unlike skyrmions, don't drift when they move along a device and are therefore excellent candidates for data technologies. Furthermore, theory collaborators in the United Kingdom had predicted that hopfions could emerge from a multilayered 2D magnetic system.

The current study is the first to put those theories to the test, Fischer said.

Using nanofabrication tools at Berkeley Lab's Molecular Foundry, Noah Kent, a Ph.D. student in physics at UC Santa Cruz and in Fischer's group at Berkeley Lab, worked with Molecular Foundry staff to carve out magnetic nanopillars from layers of iridium, cobalt, and platinum.

The multilayered materials were prepared by UC Berkeley postdoctoral scholar Neal Reynolds under the supervision of co-senior author Frances Hellman, who holds the titles of senior faculty scientist in Berkeley Lab's Materials Sciences Division and professor of physics and of materials science and engineering at UC Berkeley. She also leads the Department of Energy's Non-Equilibrium Magnetic Materials (NEMM) program, which supported this study.

Hopfions and skyrmions are known to co-exist in magnetic materials, but each has its own characteristic spin pattern in three dimensions. So, to tell them apart, the researchers imaged the two quasiparticles' distinct spin patterns with a combination of two advanced magnetic X-ray microscopy techniques: X-PEEM (X-ray photoemission electron microscopy) at Berkeley Lab's synchrotron user facility, the Advanced Light Source, and magnetic soft X-ray transmission microscopy (MTXM) at ALBA, a synchrotron light facility in Barcelona, Spain.

To confirm their observations, the researchers then carried out detailed simulations to mimic how 2D skyrmions inside a magnetic device evolve into 3D hopfions in carefully designed multilayer structures, and how these would appear when imaged with polarized X-ray light.

"Simulations are a hugely important part of this process, enabling us to understand the experimental images and to design structures that will support hopfions, skyrmions, or other designed 3D spin structures," Hellman said.

To understand how hopfions will ultimately function in a device, the researchers plan to employ Berkeley Lab's unique capabilities and world-class research facilities - which Fischer describes as "essential for carrying out such interdisciplinary work" - to further study the exotic quasiparticles' dynamical behavior.

"We have known for a long time that spin textures are almost inevitably three dimensional, even in relatively thin films, but direct imaging has been experimentally challenging," said Hellman. "The evidence here is exciting, and it opens doors to finding and exploring even more exotic and potentially significant 3D spin structures."

Credit: 
DOE/Lawrence Berkeley National Laboratory

Artificial Intelligence could 'crack the language of cancer and Alzheimer's'

image: Fluorescence microscopy image of protein condensates forming inside living cells.

Image: 
Weitz lab, Harvard University

Powerful algorithms used by Netflix, Amazon and Facebook can 'predict' the biological language of cancer and neurodegenerative diseases like Alzheimer's, scientists have found.

Big data produced during decades of research was fed into a computer language model to see whether artificial intelligence could make more advanced discoveries than humans.

Academics based at St John's College, University of Cambridge, found the machine-learning technology could decipher the 'biological language' of cancer, Alzheimer's, and other neurodegenerative diseases.

Their ground-breaking study has been published in the scientific journal PNAS today (April 8, 2021) and could be used in the future to 'correct the grammatical mistakes inside cells that cause disease'.

Professor Tuomas Knowles, lead author of the paper and a Fellow at St John's College, said: "Bringing machine-learning technology into research into neurodegenerative diseases and cancer is an absolute game-changer. Ultimately, the aim will be to use artificial intelligence to develop targeted drugs to dramatically ease symptoms or to prevent dementia happening at all."

Every time Netflix recommends a series to watch or Facebook suggests someone to befriend, the platforms are using powerful machine-learning algorithms to make highly educated guesses about what people will do next. Voice assistants like Alexa and Siri can even recognise individual people and instantly 'talk' back to you.

Dr Kadi Liis Saar, first author of the paper and a Research Fellow at St John's College, used similar machine-learning technology to train a large-scale language model to look at what happens when something goes wrong with proteins inside the body to cause disease.

She said: "The human body is home to thousands and thousands of proteins and scientists don't yet know the function of many of them. We asked a neural network based language model to learn the language of proteins.

"We specifically asked the program to learn the language of shapeshifting biomolecular condensates - droplets of proteins found in cells - that scientists really need to understand to crack the language of biological function and malfunction that cause cancer and neurodegenerative diseases like Alzheimer's. We found it could learn, without being explicitly told, what scientists have already discovered about the language of proteins over decades of research."

Proteins are large, complex molecules that play many critical roles in the body. They do most of the work in cells and are required for the structure, function and regulation of the body's tissues and organs - antibodies, for example, are proteins that protect the body.

Alzheimer's, Parkinson's and Huntington's diseases are three of the most common neurodegenerative diseases, but scientists believe there are several hundred.

In Alzheimer's disease, which affects 50 million people worldwide, proteins go rogue, form clumps and kill healthy nerve cells. A healthy brain has a quality control system that effectively disposes of these potentially dangerous masses of proteins, known as aggregates.

Scientists now think that some disordered proteins also form liquid-like droplets of proteins called condensates, which don't have a membrane and merge freely with each other. Unlike protein aggregates, which are irreversible, protein condensates can form and re-form, and are often compared to blobs of shapeshifting wax in lava lamps.

Professor Knowles said: "Protein condensates have recently attracted a lot of attention in the scientific world because they control key events in the cell such as gene expression (how our DNA is converted into proteins) and protein synthesis (how the cells make proteins).

"Any defects connected with these protein droplets can lead to diseases such as cancer. This is why bringing natural language processing technology into research into the molecular origins of protein malfunction is vital if we want to be able to correct the grammatical mistakes inside cells that cause disease."

Dr Saar said: "We fed the algorithm all of data held on the known proteins so it could learn and predict the language of proteins in the same way these models learn about human language and how WhatsApp knows how to suggest words for you to use.

"Then we were able ask it about the specific grammar that leads only some proteins to form condensates inside cells. It is a very challenging problem and unlocking it will help us learn the rules of the language of disease."

The machine-learning technology is developing at a rapid pace due to the growing availability of data, increased computing power, and technical advances which have created more powerful algorithms.

Further use of machine learning could transform future cancer and neurodegenerative disease research. Discoveries could be made beyond what scientists currently know and speculate about diseases, and potentially even beyond what the human brain can understand without the help of machine learning.

Dr Saar explained: "Machine-learning can be free of the limitations of what researchers think are the targets for scientific exploration and it will mean new connections will be found that we have not even conceived of yet. It is really very exciting indeed."

The network the team developed has now been made freely available to researchers around the world, enabling more scientists to build on this work.

Credit: 
St. John's College, University of Cambridge