Mayo researchers recommend genetic testing for all women with a breast cancer diagnosis under age 66

ROCHESTER, Minn. -- A study by researchers at Mayo Clinic published this week in the Journal of Clinical Oncology suggests that all women with a breast cancer diagnosis under the age of 66 be offered germline genetic testing to determine if they have a gene mutation known to increase the risk of developing other cancers and cancers among blood relatives. Current guidelines from the National Comprehensive Cancer Network (NCCN) recommend germline testing for all women diagnosed with breast cancer under the age of 46 regardless of their family history and breast cancer subtype.

"There is considerable confusion regarding the best method for selecting who may benefit from hereditary cancer genetic testing from among all women diagnosed with breast cancer," says Fergus Couch, Ph.D., a breast cancer researcher at Mayo Clinic. "The NCCN has very specific guidelines for who may benefit from genetic testing based on the age of diagnosis and family history of certain cancers while the American Society of Breast Surgeons (ASBrS ) recommends testing all women with breast cancer."

For their study, Dr. Couch and his colleagues evaluated all known breast cancer predisposition genes in a Mayo Clinic breast cancer registry and showed that NCCN guidelines overlooked approximately 30% of patients with genetic mutations known to increase the risk of developing breast cancer.

Based on this information, Dr. Couch and his colleagues recommend raising the age cutoff for genetic testing to include all women diagnosed with breast cancer under the age of 66, irrespective of family history of cancer. "This change would help identify 98% of women with BRCA1 and BRCA2 mutations, and more than 90% of women with mutations in other predisposition genes, while avoiding testing of 20% of all breast cancers," says Dr. Couch. He says this approach may also reduce the burden on the genetic services needed for women receiving testing.
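
The arithmetic behind such cutoff comparisons is straightforward. Here is a minimal sketch in Python using a toy registry with hypothetical fields ("age_at_diagnosis", "carrier"); it is an illustration of the trade-off, not the study's actual data or analysis code:

```python
# Minimal sketch of evaluating an age cutoff for genetic testing.
# The registry below is a toy with hypothetical fields; it is not
# the Mayo Clinic registry or the study's actual analysis.

def evaluate_cutoff(patients, cutoff_age):
    """Return (fraction of carriers detected, fraction of patients not tested)."""
    tested = [p for p in patients if p["age_at_diagnosis"] < cutoff_age]
    carriers = [p for p in patients if p["carrier"]]
    found = [p for p in tested if p["carrier"]]
    sensitivity = len(found) / len(carriers) if carriers else 0.0
    testing_avoided = 1 - len(tested) / len(patients)
    return sensitivity, testing_avoided

registry = [
    {"age_at_diagnosis": 38, "carrier": True},
    {"age_at_diagnosis": 45, "carrier": False},
    {"age_at_diagnosis": 52, "carrier": True},
    {"age_at_diagnosis": 61, "carrier": True},
    {"age_at_diagnosis": 70, "carrier": False},
]

for cutoff in (46, 66):
    sens, avoided = evaluate_cutoff(registry, cutoff)
    print(f"cutoff <{cutoff}: carriers detected {sens:.0%}, testing avoided {avoided:.0%}")
```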

"We were surprised to find that the NCCN guidelines missed approximately 30% of mutation carriers in breast cancer predisposition genes," says Siddhartha Yadav, M.B.B.S., a medical oncologist and first author of the study. "A few recent studies have demonstrated that NCCN guidelines could miss a substantial number of mutation carriers. However, these studies included several genes that are not typically associated with breast cancer risk. Our study was appropriately restricted to nine breast cancer predisposition genes with clear management guidelines."

Dr. Couch says it was encouraging to note that by simply changing the age cutoff for germline genetic testing in women with breast cancer, rather than other more complicated approaches, it should be possible to identify the majority of mutation carriers. In women diagnosed with breast cancer over the age of 65, the study supports the use of NCCN guidelines to decide who should undergo germline genetic testing. The overall result is that many more women and their family members can benefit from knowing that they are at increased risk of cancer.

Credit: 
Mayo Clinic

Could targeting an Alzheimer's-associated protein prevent autism?

image: Gladstone researchers Lennart Mucke and Chao Tai discovered a potential role for the Alzheimer's disease-related protein tau in autism spectrum disorders

Image: 
Lauren Bayless, Gladstone Institutes

SAN FRANCISCO, CA--March 2, 2020--Autism manifests in myriad forms. Symptoms and severity vary from person to person, but all autism spectrum disorders share three core symptoms: impaired social interactions, communication deficits, and excessive repetitive behaviors.

No existing medications can adequately treat the core symptoms, and new treatments are urgently needed.

Now, researchers at the Gladstone Institutes report in Neuron that reducing levels of a protein called tau prevents the core symptoms from arising in mouse models simulating different forms of autism spectrum disorders. Tau reduction in these mice also prevented seizures, which occur in 30 percent of people with autism.

"Our findings suggest that tau reduction holds promise as a potential treatment for some forms of autism," said Lennart Mucke, MD, the founding director of the Gladstone Institute of Neurological Disease and senior author of the new paper.

Tau has never before been linked to autism, but it is known for its role in Alzheimer's disease and other neurodegenerative conditions.

"We've uncovered an unexpected new connection between degenerative diseases of aging and developmental disorders of childhood," said Mucke, who is also a professor of neurology and neuroscience at the University of California, San Francisco.

Making the Leap from Alzheimer's to Autism

These surprising findings stem from an investigation of links between Alzheimer's disease and epilepsy. Originally, Mucke and colleagues showed that tau reduction prevents epileptic activity and cognitive deficits in mouse models of Alzheimer's disease and of Dravet syndrome, a severe childhood epilepsy.

These findings intrigued Chao Tai, PhD, a scientist on Mucke's team and first author of the new paper, who had previously studied Dravet syndrome at the University of Washington.

"We wondered whether tau reduction could also prevent the signs of autism that are often seen in people with Dravet syndrome," Tai said.

To explore this possibility, the team investigated a mouse model of Dravet syndrome after deleting one or both copies of the gene that encodes tau. They found that reducing tau indeed prevented the development of core autism symptoms in this model. Even a 50 percent reduction of tau showed major benefits.

Because the causes of autism are so diverse, the researchers next tested the effect of tau reduction in a second mouse model of autism that results from a very different genetic mutation.

Sure enough, "it also worked beautifully," Tai said. "The autism-like behaviors were again strongly suppressed by tau reduction."

Tau reduction also prevented epilepsy, as well as two additional abnormalities seen in people with autism and related mouse models: an enlargement of the brain known as megalencephaly, and over-activation of the PI3K-Akt-mTOR signaling pathway, which regulates many important cell functions.

Finally, the researchers investigated the molecular mechanisms by which tau reduction prevents these abnormalities. They found that tau reduction enhances the activity of a powerful enzyme called PTEN, which can prevent overactivation of the autism-promoting signaling pathway.

Addressing a Dire Need for Novel Treatments

The new findings highlight the potential of tau reduction to counteract both neurologic and psychiatric disorders. "Tau reduction appears to be the first strategy that can prevent both autism and epilepsy, two challenging conditions that all too often afflict the same people," Mucke said.

"Autism is very frequent, affecting roughly one in 60 children," Tai said. "Our findings could help address the urgent need for the development of better therapeutic strategies."

However, as the paper also demonstrates, tau reduction will probably not be effective against all forms of autism, and more research would be needed to determine exactly who might benefit.

In addition, the alterations that cause autism likely affect the brain during early stages of development and well before the diagnosis is typically made. "We are therefore eager to investigate whether tau reduction can also reverse symptoms of autism after they have emerged," Mucke said. "Ongoing studies in experimental models should help clarify optimal times for administering tau-lowering drugs to prevent or treat symptoms."

Mucke and colleagues are developing and testing small-molecule drugs that could lower tau levels or increase the activity of PTEN. Other investigational tau-lowering approaches are already being tested in people as a potential treatment for Alzheimer's disease, and this new research suggests that findings from those trials may also be informative in regard to autism.

Credit: 
Gladstone Institutes

'Digital disruption' a game-changer for climate: Future Earth report

image: The report, "Digital Disruptions for Sustainability," is available post-embargo at sustainabilitydigitalage.org

Image: 
Future Earth

Youth on the streets are calling for "systems change, not climate change." And, according to a new report by Future Earth, the digital transformations unfolding today could help answer this call.

Historically, the report says, climate and digital agendas have been approached as two independent issues, but they are increasingly recognized as intertwined. Humans are connected to each other through, and dependent on, both the digital and the natural worlds.

Global systemic risks are likely to emerge from both these worlds if we fail to act urgently and continue on our current trajectory.

Yet within this link lies an opportunity to re-shape our everyday interactions with each other and the natural world, the way we conduct business, and how we govern our society, to meet the climate crisis.

The report, Digital Disruptions for Sustainability (D^2S, available post-embargo at https://sustainabilitydigitalage.org), explores these interconnected agendas and highlights research, innovation, and actions needed to drive societal transformations in support of a more sustainable and equitable world. This report was developed over a year-long collaborative process with input from more than 250 sustainability and digital experts worldwide from academia, business, and civil society.

"Climate strategies tend to focus on targeting investments on emission reductions by sector," says Amy Luers, Executive Director of Future Earth and the project's leader.

"This sector-based work is critical, of course, but insufficient to meet our climate goals. This is because while research indicates that deep decarbonization is technically possible, we have not yet figured out how to steer society onto a deep decarbonization path. More research and innovation on this issue are urgently needed."

"This is the focus of the D^2S Agenda. It approaches climate as a social challenge. Rather than focus on the high carbon-emitting sectors, the Agenda focuses on the rules, norms, power structures, and mindsets underpinning all sectors and constraining climate actions."

It explores the opportunities and challenges of leveraging new capabilities of the digital age to break these constraints and drive rapid and unprecedented societal transformations needed to achieve the Paris Agreement climate goals.

Examples include leveraging the digital age to decentralize power from the top and empower more stakeholders, to shift social norms of consumption toward low-carbon products, and to reshape society's mindset from using fossil fuels more efficiently to moving off them entirely.

"The initial promise of the digital revolution was democratized information, more accountable governments through broader citizen participation, and the growth of a more equitable and greener economy," says Dr. Luers. "Yet many of these aspirations have not been realized, because society failed to anticipate how the digital revolution would unfold. As a result, today the digital world threatens individual rights, human dignity, social justice, the future of democracy, and environmental sustainability."

According to the D^2S Agenda, it is not too late to change course.

Artificial intelligence, coupled with a broader range of digital tools, still has the potential to change our economic systems, our governance systems, and even our cognitive systems for the better - in short, to achieve the systems change that young protesters are demanding. But it will take a conscious collective effort to make that happen.

As Dirk Messner, President of the German Environment Agency, and a collaborator on the project, comments in the report, "We will only achieve our sustainability goals if digitalization is consciously geared towards them."

The D^2S Agenda outlines a framework for this collective work.

It centers on steering the effects of four "digital disruptors" that are fundamentally altering the power, rules, and mindsets of society today.

The 4 "Digital Disruptors"

Unprecedented Transparency:
Satellites and other remote sensors in cell phones and elsewhere are making transparency the norm and privacy harder to protect.

Mass Collaboration:
The social web and the rapid spread of mobile devices are giving rise to new ways to collaborate around the world.

Intelligent Systems:
Big data, machine learning capabilities, and cloud computing have enabled smart systems that combine human and machine intelligence.

Mixed Reality:
Virtual and augmented reality are merging the physical and virtual worlds, shifting how we engage with each other and the environment.

While these digital disruptors are already driving societal transformations at an unprecedented scale and pace, they are not on track to build a climate-safe and equitable world. The D^2S Agenda outlines an initial set of research, innovation, and action priorities to make that shift.

This work will require collaboration across disciplines and sectors, including efforts from the private sector to build these partnerships.

For example, says Lucas Joppa, the Chief Environment Officer at Microsoft, and an advisor on the D^2S Agenda: "By accelerating investment and deployment of AI solutions, we have the potential not only to mitigate climate-related risk for our businesses, but to fundamentally transform how we manage Earth's natural resources for a more prosperous and climate-stable future."

While the private sector is a critical part of the solution, the D^2S Agenda highlights that success depends on involving all sectors of society, including the most marginalized, in digital transformations to achieve climate solutions.

As Leena Srivastava, Deputy Director of the International Institute for Applied Systems Analysis (IIASA) and a co-chair of Future Earth Advisory Committee, writes, "Sustainability calls for digital empowerment of the poor; not digital empowerment for the poor."

As a result, tackling the climate crisis and working towards a just and equitable digital future are inherently interconnected agendas.

The D^2S Agenda is part of a new initiative - Sustainability in the Digital Age - which seeks to support and strengthen the growing diversity of actors engaging with the interconnected digital and sustainability agendas, a critical step in driving the changes needed to build a more sustainable and equitable world.

The framework of the D^2S Agenda is sketched out in an animated video here: http://bit.ly/2VnZRe7

Advisors / collaborators reflect on the D^2S Agenda

"Data is not the new oil - it's the new plutonium. Amazingly powerful, dangerous when it spreads, difficult to clean up and with serious consequences when improperly used. Data governance is therefore more urgent as a policy challenge than climate change because abuse of data compromises the very democratic processes on which we rely to intelligently and effectively address challenges like climate change. The Digital Disruptions for Sustainability Agenda provides a helpful framework for understanding the powerful connection between the data governance and the climate agendas, and highlights important work needed to move forward on both."

Jim Balsillie
Council of Canadian Innovators; member, Future Earth Advisory Committee

"Climate change is humanity's biggest crisis. A critical obstacle to addressing this crisis is that, despite the growing intensity of extreme weather events, to many people climate impacts still often seem distant and abstract. Machine Learning and interactive technologies could help make climate risks more concrete and more personal. Our hope is that these technologies will enable the scaling of more targeted and personalized public engagement strategies that could ultimately strengthen collective action."

Yoshua Bengio
A.M. Turing Award Winner, 2018; MILA; University of Montreal

"Many are optimistic about the role of unprecedented levels of transparency in securing more accountable and effective global sustainability governance. Yet, research suggests that transparency may not be all that it promises to be. For example, transparency is often assumed to be essential to trust, however, the opposite might well hold: there might need to be trust first, in order to have meaningful transparency. And thus it is critical to research not only the design of transparency systems, but also the normative and political contexts within which such systems are deployed, as these shape whether and under what conditions transparency may realize its transformative potential in global sustainability governance."

Aarti Gupta
Professor, Wageningen University

"Digital technologies are enabling unprecedented transparency of lifecycle impact data of raw materials, products, and supply chains and present new platforms to channel consumer behavior into market signals to activate demand for sustainable products. In order to steer towards this opportunity, it is imperative to advance dialogues around the role of government and other actors in the digital economy."

Tom Hassenboehler
Partner, The Coefficient Group; Executive Director and Founder, EC-MAP

"As we work to implement decarbonization strategies, we are proactively working with partners to leverage the power of data and artificial intelligence to be part of the broader solution of building a climate-safe world."

Ravi Jain
VP Search Science and AI, Amazon

"At ClimateWorks, we've been exploring how alternative futures might impact climate strategies. One critical disruptive force is the digital revolution, which is creating new challenges but may also offer huge opportunities to drive systems change and accelerate climate action. The D^2S Agenda sets out a valuable framework for leveraging the digital revolution to achieve positive change."

Charlotte Pera
President and CEO, ClimateWorks Foundation

"We need to focus on harnessing the potential of the digital sector for global public benefit. This will require public-private partnerships to both support the development of public benefit data and services, and to build the institutional and regulatory context needed to steer the digital transformations underway to both empower business and support the wellbeing of people and the planet."

Asun Lera St. Clair
Senior Principal Scientist, DNV GL; member, Future Earth Advisory Committee

Credit: 
Terry Collins Assoc

Early Earth may have been a 'waterworld'

image: Benjamin Johnson inspects an outcrop in the Panorama district by what was once an ancient hydrothermal vent.

Image: 
Jana Meixnerova

Kevin Costner, eat your heart out. New research shows that the early Earth, home to some of our planet's first lifeforms, may have been a real-life "waterworld"--without a continent in sight.

The study, which appears March 2 in Nature Geoscience, takes advantage of a quirk of hydrothermal chemistry to suggest that the surface of Earth was likely covered by a global ocean 3.2 billion years ago. It may even have looked a bit like the post-apocalyptic, and land-free, future imagined in Costner's infamous film Waterworld.

The group's findings could help scientists to better understand how and where single-cell organisms first emerged on Earth, said Boswell Wing, a coauthor of the research.

"The history of life on Earth tracks available niches," said Wing, an associate professor in the Department of Geological Sciences at the University of Colorado Boulder. "If you've got a waterworld, a world covered by ocean, then dry niches are just not going to be available."

The study also feeds into an ongoing debate over what ancient Earth may have looked like: Was the planet much hotter than it is today?

"There was seemingly no way forward on that debate," said lead author Benjamin Johnson, who conducted the research during a postdoctoral position in Wing's lab at CU Boulder. "We thought that trying something different might be a good idea."

A crazy place

For Johnson and Wing, that something different centered on a geologic site called the Panorama district, located deep in northwestern Australia's outback.

"Today, there are these really scrubby and rolling hills that are cut through by dry river beds," said Johnson, now an assistant professor at Iowa State University in Ames. "It's a crazy place."

It's also the resting spot for a 3.2 billion-year-old chunk of ocean crust that's been turned on its side.

In the span of a day at Panorama, you can walk across what used to be the hard, outer shell of the planet--all the way from the base of that crust to the spots where water once bubbled up through the seafloor via hydrothermal vents.

The researchers saw it as a one-of-a-kind opportunity to pick up clues about the chemistry of ocean water from billions of years ago.

"There are no samples of really ancient ocean water lying around, but we do have rocks that interacted with that seawater and remembered that interaction," Johnson said.

The process, he explained, is like analyzing coffee grounds to gather information about the water that poured through them. To do that, the researchers analyzed data from more than 100 rock samples from across the dry terrain.

They were looking, in particular, for two different flavors--or "isotopes"--of oxygen trapped in stone: a slightly heavier atom called Oxygen-18 and a lighter one called Oxygen-16.

The duo discovered that the ratio of those two isotopes of oxygen may have been a bit off in seawater 3.2 billion years ago--with just a smidge more Oxygen-18 atoms than you'd see today.

"Though these mass differences seem small, they are super sensitive," Wing said.

Lost at sea

Sensitive, it turns out, to the presence of continents. Wing explained that today's land masses are covered by clay-rich soils that disproportionately take up heavier oxygen isotopes from the water--like mineral vacuums for Oxygen-18.

The team theorized that the most likely explanation for that excess Oxygen-18 in the ancient oceans was that there simply weren't any soil-rich continents around to suck the isotopes up. That doesn't mean, however, that there weren't any spots of dry land around.

"There's nothing in what we've done that says you can't have teeny, micro-continents sticking out of the oceans," Wing said. "We just don't think that there were global-scale formation of continental soils like we have today."

Which leaves a big question: When did plate tectonics push up the chunks of rock that would eventually become the continents we know and love?

Wing and Johnson aren't sure. But they're planning to scour other, younger rock formations at sites from Arizona to South Africa to see if they can spot when land masses first roared onto the scene.

"Trying to fill that gap is really important," Johnson said.

For now, Costner may want to start planning the prequel.

Credit: 
University of Colorado at Boulder

Carbon chains adopt fusilli or spaghetti shapes if they have odd or even numbers

image: The image shows how the conformation (shape) of our carbon chains alternates between ordered and chaotic structures as the carbon chain alternates between having even and odd numbers of atoms.

Image: 
University of Bristol

Helical shapes are very familiar in the natural world and, at the molecular level, in DNA, the very blueprint of life itself.

Scientists at the University of Bristol have now found that carbon chains can also adopt helical shapes but, unlike DNA, the shape depends on how many atoms there are in the chain: chains with even numbers of carbon atoms adopt helical, fusilli-like shapes, while chains with odd numbers of carbon atoms adopt floppy, spaghetti-like shapes.

The difference, say the research team, between order and chaos is a single carbon atom. Their study is published today in the journal Nature Chemistry.

Carbon chains are like spaghetti - they are rather floppy and adopt a set of random and constantly changing shapes.

The Bristol team, from the University's School of Chemistry, showed that by judicious insertion of methyl substituents along carbon chains they could control their shape so that they adopted well-defined linear (penne) or helical (fusilli) conformations.

The helical conformations can be either right- or left-handed, and the team wanted to know what controls which helix forms.

Lead author, Professor Varinder Aggarwal, said: "We were astonished to find that the length of the carbon chain (number of carbon atoms) controlled whether the right or left-handed helix formed.

"Even more surprising was that carbon chains with even numbers of atoms formed well-defined helical structures (fusilli) but odd numbered carbon chains were much floppier and more random in shape (spaghetti).

"The change in properties of a homologous series of molecules caused by the single addition of an extra carbon atom is extremely rare - here it results in the difference between order and chaos."

This type of odd-even effect has been observed in some bulk properties, such as in carpets of alkanethiols on a gold surface, but such behaviours in solution are not well recognised or understood.

Through computation and measurement of molecular properties, Professor Aggarwal and his team have been able to fully understand the origin of this odd-even effect which is controlled by the end groups.

When the end groups both promote the same sense of helicity, an ordered structure is obtained, but when each end promotes an opposite helix, chaotic structures are obtained.
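
A toy way to see this logic (purely illustrative; this is not the study's computational model) is to treat each chain end as promoting one helical sense, with the chain's parity deciding whether the two preferences agree:

```python
# Toy parity model of the odd-even effect: each chain end "prefers"
# a helical sense, and chain-length parity decides whether the two
# preferences agree. Purely illustrative; not the study's model.

def conformation(n_carbons: int) -> str:
    left = "P"                                   # one end always promotes a P (right-handed) helix
    right = "P" if n_carbons % 2 == 0 else "M"   # parity flips the other end's preference
    return "ordered helix (fusilli)" if left == right else "disordered (spaghetti)"

for n in range(8, 13):
    print(n, "carbons:", conformation(n))
```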

For future technological applications, these fundamental findings will guide the design of molecules with desirable conformational and physical properties.

Carbon chains with an even number of atoms will lead to molecules with well-defined helical shapes for their application as non-switchable rigid materials or as scaffolds for the presentation of molecular recognition elements.

Helices are a fundamental structure in biological molecules (DNA, proteins) and it is intriguing to imagine the analogies to molecules of the sort described in the study.

Professor Aggarwal added: "Carbon chains with an odd number of atoms were found to adopt floppier and more random shapes.

"We are now studying whether the shape of these chains can in fact be controlled by manipulating the groups at the ends of the chain. This may enable us to switch from one screw-sense to another for applications in responsive materials."

Credit: 
University of Bristol

The fantastical Adelaide Fringe

image: Events are at the very heart of competing for financial and human capital.

Image: 
University of SA / Emma Mellett

From sky-high acrobatics to sultry-sequined burlesque, Adelaide's annual Fringe festival has long been transforming the city into an eclectic and vibrant hive of activity, attracting millions of visitors and directing millions of dollars into the South Australian economy.

Yet, beyond the economics, a new study from the University of South Australia shows that the Adelaide Fringe also plays a crucial role in building the State's social capital, a factor which is helping to combat South Australia's 'brain drain'.

As the South Australian population continues to age, the issue of 'brain drain' remains prominent. According to the Australian Bureau of Statistics, an annual average of 3000+ young South Australians (aged 25-29 years) migrate interstate in search of more fruitful opportunities.

UniSA's tourism and event experts, Dr Chris Krolikowski and Dr Sunny Lee, say the new study shows how the Adelaide Fringe festival is growing the city's social capital through placemaking - the process of shaping and reimagining public spaces to create meaning for local residents and visitors.

"Building peoples' connection and affinity with places in South Australia is essential if we want to effectively tackle the challenge of 'brain drain'," Dr Krolikowski says.

"The Adelaide Fringe achieves this in multiple ways - not only through the kaleidoscope of performances that add colour, excitement, and atmosphere to our city and surrounds, but also through the regeneration of urban spaces, and that gives people a reason to visit and engage.

"By creating vibrant and engaging experiences and places for residents and visitors, the Fringe is building meaningful connections with locals and visitors, generating positive word-of-mouth and showcasing South Australia as a great destination to work, live and visit.

"The cumulation of these efforts contributes to a growing sense of pride - a very powerful factor in people's positive connection with a place, and a key combatant against brain drain."

Now in its 60th year, the Adelaide Fringe is the largest arts festival in the Southern Hemisphere. In 2019, the Adelaide Fringe increased tourist attendance by 72 per cent and generated $95 million in expenditure for the state economy. According to the festival's annual reports, 96.1 per cent of SA-based respondents said that the Fringe was culturally important to South Australia.

Dr Lee says while other Australian cities may have greater financial resources to invest in iconic attractions and events, Adelaide has a unique competitive advantage in its open spaces that are primed for events.

"Building competitive advantage is important for all destinations, but in South Australia, where we have plentiful open spaces close to the key city hubs, we're in a prime position to host events and concurrently build social cohesion and belonging.

"The challenge for stakeholders, however, is to take a holistic and longitudinal view of events and to understand that they are at the very heart of competing for financial and human capital. This is key to arresting South Australia's brain drain."

Credit: 
University of South Australia

Hydrogen energy at the root of life

image: Hydrothermal vent in the 'Lost City' Hydrothermal Field in the Atlantic.

Image: 
Susan Lang, U. of SC. / NSF / ROV Jason / 2018 © Woods Hole Oceanographic Institution

Since the discovery of submarine hydrothermal vents around 40 years ago, these natural chemical reactors have been a focus for evolutionary researchers searching for the origin of life. The vents emit hot water containing minerals, including simple but reactive chemical substances such as hydrogen gas (H2) and carbon dioxide (CO2). Conditions like these could have resulted in the very first biochemical reactions, and thus in the emergence of the first free-living cells.

The starting point of such primitive metabolism of the first microbes is carbon dioxide and hydrogen gas. Microbes that live from these substances first convert the two gases into formic acid (formate), acetate and pyruvate (salts of acetic acid and pyruvic acid). They then use these to make all of their organic material through a dense roadmap of complicated reactions. What chemist Dr. Martina Preiner from the Institute of Molecular Evolution of Heinrich Heine University Düsseldorf (HHU) and an international team now show is that precisely these basic building blocks of life emerge all by themselves in a lab environment when H2 and CO2 are left to react in the presence of simple minerals in hydrothermal conditions.

For the last 20 years, Prof. Dr. William Martin, Head of the Institute of Molecular Evolution, has been cataloguing the many parallels between metal-catalysed reactions in metabolism and chemical reactions at hydrothermal vents. Prof. Martin says: "These reactions based on H2 and CO2 that reflect the origins of the first biochemical processes can now be simulated in a lab in Düsseldorf, allowing us to mimic the earliest development phases of life."

Together with researchers from the Max Planck Institute for Coal Research in Mülheim/Ruhr, the University of Strasbourg and the National Institute of Advanced Industrial Science and Technology in Japan, Dr. Preiner has simulated these very simple reactions in a lab environment. They have been able to demonstrate that H2 and CO2 react to form formate, acetate and pyruvate overnight at temperatures of 100 degrees Celsius, as long as a few simple mineral catalysts are present - catalysts that themselves are formed in hydrothermal vents. Martina Preiner emphasises the fact that no microbial metabolism is needed. "The chemical reactions are surprisingly simple. The main products created are exactly those used by the earliest cells as a basis for their further metabolism."
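
For reference, the overall reactions being simulated can be written with standard stoichiometry (shown here as the free acids; the release itself does not spell these out):

\[
\begin{aligned}
\mathrm{CO_2} + \mathrm{H_2} &\;\rightarrow\; \mathrm{HCOOH} && \text{(formic acid / formate)}\\
2\,\mathrm{CO_2} + 4\,\mathrm{H_2} &\;\rightarrow\; \mathrm{CH_3COOH} + 2\,\mathrm{H_2O} && \text{(acetic acid / acetate)}\\
3\,\mathrm{CO_2} + 5\,\mathrm{H_2} &\;\rightarrow\; \mathrm{CH_3COCOOH} + 3\,\mathrm{H_2O} && \text{(pyruvic acid / pyruvate)}
\end{aligned}
\]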

Dr. Harun Tüysüz and his team from the Max Planck Institute in Mülheim have designed nanostructured solid catalysts for the experiments: "We observe a distinct structure-activity relationship of the solid catalysts for CO2 reduction in the origins of life context."

Serendipity would have it that two other working groups were also studying similar processes. A team of Strasbourg-based chemists working under Prof. Dr. Joseph Moran and Dr. Kamila Muchowska was using metallic iron instead of H2. The Japanese team working with microbiologist Dr. Kensuke Igarashi was examining reactions of H2 and CO2 over iron-sulphide catalysts. All of the groups observed the same products. Prof. Moran says: "Metabolism appears to have developed in a surprisingly natural way".

The origin of life poses a 'chicken or egg' problem. In addition to the simple CO2-H2 reactions, cells have to form a large number of more complex modules in order to grow and function. Modern cells generally feature proteins as catalysts, the construction of which is encoded in the genes. But what came first, the proteins or the nucleic acids? The study now published suggests that they were preceded in evolution by reactions catalysed by metals and minerals, and that both proteins and nucleic acids emerged from those reactions. The metals found in modern proteins are relics of these biochemical beginnings.

The study also sheds important light on the classical problem in origin of life research: What energy was available to the earliest life forms? Preiner and colleagues showed that the reactions of H2 with CO2 under the same conditions as those prevalent in hydrothermal vents also release energy. When producing simple compounds such as acetate, enough energy is created to allow primitive microbes to fuel their further metabolism.

This means that the fuel for original cells was hydrogen, which was formed in huge quantities in submarine environments when the Earth was young and continues to be formed today. Not only is hydrogen the cleanest of all energy forms - producing just water when combusted - it may also be the spark that created life itself. The decisive factors were having the right conditions and the right catalysts in place.

Credit: 
Heinrich-Heine University Duesseldorf

Epoxy resins: Hardening at the push of a button

video: The new resin is hardened under water.

Image: 
TU Wien

Within seconds the new material can be completely transformed. Initially, the material is transparent and either in liquid or paste form; then, when any part of it is irradiated with the appropriate light, the entire resin begins to solidify and takes on a dark colour. The special epoxy resin formula that makes this possible has been patented by TU Wien. Now, researchers have even successfully carried out the process underwater. This means that the new epoxy resin can be used for jobs that, up until now, had been very difficult to carry out, such as filling underwater cracks in bridge pillars or dams, or repairing pipes during ongoing operation.

As a further novelty, the special formula can be applied in combination with carbon fibres and carbon fibre mats. Many possibilities arise for applications in aerospace engineering, wind turbines, shipbuilding or the automotive industry - in every field where the highest mechanical performance needs to be combined with lightweight design.

Ordinary material with an extraordinary addition

Epoxy resins are standard materials that are used in the industrial sector for many different purposes, such as insulating electronic components or securing mechanical parts. The research group headed up by Professor Robert Liska (Institute of Applied Synthetic Chemistry, TU Wien) develops additives that are added to ordinary epoxy resin in order to adjust its properties and enable targeted curing at the touch of a button.

"We are developing special compounds in which light triggers a chemical reaction", explains Robert Liska." This can be a bright flash of visible light, but we also have compounds which only react to UV light." At the point where the light strikes the resin, a reaction is started that releases heat. This heat spreads and initiates a chemical cascade elsewhere until all the resin has been cured.

"The key advantage of this method is that it isn't necessary to illuminate the entire resin as with other light-curing materials", explains Liska. "It's sufficient to irradiate any part of the resin with light. The rest then cures even if it's situated deep in a dark crack that you want to fill." "Furthermore, it has to be mentioned that the whole formulation has a nearly unlimited storage stability, which makes processing significantly easier compared to state of the art materials"

Partner companies from industry have enquired whether this process would also be possible in the presence of "dark" fillers or fibres, as self-curing epoxy resin would be extremely useful for some of these more difficult applications. "On the surface, this idea contradicts all theories", says Liska. "The light has a very low penetration depth into the material because it is strongly absorbed by the carbon fibres." Nevertheless, experiments at TU Wien showed convincingly that the process works.

The efficient underwater curing also seems to contradict theory. "Initially we didn't think it would be possible. One would first expect that the water would chemically react with the components of the resin, and also that it would remove the heat required to sustain the reaction." Surprisingly, however, the light-triggered self-curing process still took place underwater.

"A key reason for this is that the chemical reaction brings the water to the boil", explains Dr Patrick Knaack, senior scientist at the same institute. "A thin protective layer of water vapour then forms between the hardening resin and the surrounding water."

Researchers are now looking for further users from industry to explore the potential of this special resin. There is currently financial support from the Austrian Research Promotion Agency (FFG) in the framework of the "Spin-off Fellowship" program, with the aim of creating a start-up company in late summer 2020. Besides applications of glass- and carbon-fibre-reinforced composites in aerospace, shipbuilding and automotive manufacturing, the restoration of buildings is a particularly interesting area. For example, cracks in structures that stand in water could be filled with viscous resin and then cured with a flash of light. The maintenance of pipelines is another job that is often difficult to carry out - the use of the new resin could also be suitable here. "There are many possibilities and we are hoping for some interesting new ideas", says Patrick Knaack.

Credit: 
Vienna University of Technology

Marine cyanobacteria do not survive solely on photosynthesis

image: The research team that carried out the study

Image: 
University of Córdoba

Marine cyanobacteria are single-cell organisms that settled in the oceans millions of years ago. They are organisms that, by means of photosynthesis, create organic material by using inorganic substances. Specifically, the cyanobacteria known as Prochlorococcus and Synechococcus are the most abundant photosynthetic organisms on Earth and they generate a large part of the oxygen necessary for life, hence the oceans are the Earth's real lungs.

Despite the relevance of marine cyanobacteria in the origin and sustainment of life, they continue to be a never-ending source of information. In fact, we knew nothing about Prochlorococcus until Professor Sallie W. Chisholm, at the Massachusetts Institute of Technology, discovered it in the 1980s. At the time, it was thought that these life-creating organisms got their nourishment solely from photosynthesis (like autotrophic organisms). However, research has shown that they also feed on organic compounds from their environment.

This hypothesis is corroborated by a review article recently published in The ISME Journal, a Nature group journal, led by researchers María del Carmen Muñoz and Guadalupe Gómez of the Department of Biochemistry and Molecular Biology at the University of Cordoba. Along with Antonio López, José Ángel Moreno, Jesús Díez and José Manuel García, researchers in the "Adaptations in the metabolism of nitrogen and carbon in Prochlorococcus" group, they analyzed studies dating back to the beginning of this century. These studies produced evidence that these organisms not only get nourishment from photosynthesis, but are also able to "eat" what they need from their environment.

This research group studied the mechanism of glucose transport in marine cyanobacteria, demonstrating that when these organisms encounter appealing compounds, such as glucose, amino acids or compounds containing iron, sulphur and phosphorus, they consume them and become more competitive.

The study of vesicles (small compartments that store compounds) released by marine cyanobacteria also supports this finding: the vesicles contain organic compounds that can feed other bacteria, which shows how important the use of organic compounds is among these organisms.

This conceptual shift is crucial at an environmental level, since it helps us better understand the cycles of elements such as carbon, iron, phosphorus and nitrogen. The essential role that cyanobacteria play in producing the oxygen necessary for life and sequestering excess carbon dioxide from the atmosphere is reinforced by this review of their nourishment: if cyanobacteria gain an advantage by using glucose and other organic compounds taken from their environment, life on Earth also benefits from these advantages.

Credit: 
University of Córdoba

Gold in limbo between solid and melted states

image: An illustration of grain boundary locations (points where lines intersect) in a polycrystalline gold thin film. The zoomed-in view shows how a melt front created at these boundaries propagates into the grains after the film is excited with an optical laser.

Image: 
Brookhaven National Laboratory

UPTON, NY--If you heat a solid material enough, the thermal energy (latent heat) causes the material's molecules to begin to break apart, forming a liquid. One of the most familiar examples of this phase transition from a well-ordered solid to a less-ordered liquid state is ice turning into water.

Though melting is a fundamental process of matter, scientists have not been fully able to understand how it works at a microscopic level, owing to the lack of research capabilities with sufficient time resolution. However, the advent of x-ray free-electron lasers (XFELs) in the past decade is making the study of the mechanism of melting, as well as other ultrafast atomic-scale dynamics, possible. These instruments use free (unbound) electrons to generate femtosecond (one-quadrillionth of a second) pulses of light in the x-ray energy region. Compared with x-ray synchrotrons, XFELs have x-ray pulses of a much shorter duration and higher intensity.

Now, a team of international scientists has used one of these instruments--the Pohang Accelerator Laboratory XFEL (PAL-XFEL) in South Korea--to monitor the melting of nanometer-thick gold films made up of lots of very tiny crystals oriented in various directions. They used an ultrashort x-ray pulse ("probe") to monitor the structural changes following the excitation of these polycrystalline gold thin films by a femtosecond laser ("pump"), which induces melting. When the x-ray pulse strikes the gold, the x-ray beam gets diffracted in a pattern that is characteristic of the material's crystal structure. By collecting x-ray diffraction images at different pump-probe time delays on picosecond (one-trillionth of a second) scales, they were able to take "snapshots" as melting began and progressed in the gold thin films. Changes in the diffraction patterns over time revealed the dynamics of crystal disordering. The scientists selected gold for this study because it diffracts x-rays very strongly and has a well-defined solid-to-liquid transition.

The x-ray diffraction patterns revealed that melting is inhomogeneous (nonuniform). In a paper published online in the Jan. 17 issue of Science Advances, scientists proposed that this melting likely originates at the interfaces where crystals of different orientations meet (imperfections called grain boundaries) and then propagates into the small crystalline regions (grains). In other words, the grain boundaries start melting before the rest of the crystal.

"Scientists believed that melting in polycrystalline materials occurs preferentially at surfaces and interfaces, but before XFEL the progression of melting as a function of time was unknown," said co-corresponding author Ian Robinson, leader of the X-ray Scattering Group in the Condensed Matter Physics and Materials Science (CMPMS) Division at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory. "It was known that the laser generates "hot" (energetic) electrons, which cause melting when they transfer their energy to the crystal. The idea that this energy transfer process happens preferentially at grain boundaries and thus is not uniform has never been proposed until now."

"The mechanism of laser-induced melting is important to consider for micromachining of precision parts used in aerospace, automotive, and other industries," added first author Tadesse Assefa, a postdoc in Robinson's group. "The way the laser couples to the material is different depending on the pulse duration of the laser. For example, the ultrashort pulses of femtosecond lasers seem to be better than the longer pulses of nanosecond lasers for making clean cuts such as drilling holes."

For their experiment, the scientists first fabricated thin films of varying thickness (50, 100, and 300 nanometers) at the Center for Functional Nanomaterials (CFN)--a DOE Office of Science User Facility at Brookhaven. Here, in the CFN Nanofabrication Facility, they performed electron-beam evaporation, a deposition technique that uses electrons to condense the desired material onto a substrate. The ultraclean environment of this facility enabled them to create gold films of uniform thickness over a large sample area.

At PAL-XFEL, they conducted time-resolved x-ray diffraction on these films over a range of laser power levels. Software developed by staff in Brookhaven Lab's Computational Science Initiative handled the high-throughput analysis of the terabytes of data generated as a detector collected the diffraction pattern images. The team then used software developed by scientists at Columbia Engineering to convert these images into linear graphs.

The plots revealed a double peak corresponding to a "hot" region undergoing melting (intermediate peak) and a relatively "cold" region (the rest of the crystal) which has yet to receive the latent heat of melting. Through electron coupling, heat goes to the grain boundaries and then conducts into the grains. This uptake of latent heat results in a band of melting material sandwiched between two moving melt fronts. Over time, this band becomes larger.
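
An analysis of this kind can be sketched by fitting two Gaussian components to a diffraction peak profile. The following Python snippet uses synthetic data and illustrative parameters; it is a sketch of the idea, not the team's actual pipeline:

```python
import numpy as np
from scipy.optimize import curve_fit

def two_gaussians(q, a1, q1, w1, a2, q2, w2):
    """Sum of two Gaussian peaks: a 'cold' (solid) and a 'hot' (melting) component."""
    return (a1 * np.exp(-((q - q1) / w1) ** 2)
            + a2 * np.exp(-((q - q2) / w2) ** 2))

# Synthetic profile: strong "cold" Bragg peak plus a weaker "hot"
# shoulder shifted to lower q by thermal lattice expansion.
q = np.linspace(2.5, 2.9, 400)
rng = np.random.default_rng(0)
signal = two_gaussians(q, 1.0, 2.70, 0.02, 0.3, 2.66, 0.03)
signal += rng.normal(0.0, 0.01, q.size)

p0 = [1.0, 2.70, 0.02, 0.3, 2.66, 0.03]  # initial guesses for the fit
popt, _ = curve_fit(two_gaussians, q, signal, p0=p0)
a1, q1, w1, a2, q2, w2 = popt
print(f"cold peak at q = {q1:.3f}, hot component at q = {q2:.3f}")
print(f"hot fraction of total peak area: {a2 * w2 / (a1 * w1 + a2 * w2):.0%}")
```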

"One melt front is between a solid and melting region, and the other between a melting and liquid region," explained Robinson.

Next, the team plans to confirm their two-front model by reducing the size of the grains (thereby increasing the number of grain boundaries) so they can reach the end of the melting process. Because melting occurs as a wave traversing the crystal grains at a relatively slow speed (30 meters per second), it takes longer than the timing range of the instrument (500 picoseconds) to cross big grains.
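
The mismatch is simple arithmetic: at the stated front speed, the distance a melt front covers within the instrument's timing window is

\[
d = v\,t = 30\ \mathrm{m\,s^{-1}} \times 500\ \mathrm{ps} = 15\ \mathrm{nm},
\]

far smaller than a large grain, which is why finer-grained films are needed to watch melting run to completion.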

They would also like to look at other metals, alloys (mixtures of several metals or a metal combined with other elements), and catalytically relevant materials, in which grain boundaries are involved in chemical reactions.

"This study represents the very beginning of how we build an understanding of the mechanism of melting," said Assefa. "By performing these experiments using different materials, we will be able to determine if our model is generalizable."

Credit: 
DOE/Brookhaven National Laboratory

New study reveals the secret of magmatic rocks consisting of only one mineral

image: An example of a thick layer of stratiform anorthosite (white) from the world-renowned Bushveld Complex in South Africa.

Image: 
Wits University

Geologists from Wits University in Johannesburg, South Africa, have come up with an original explanation of how nature may produce an intriguing class of magmatic rocks that are made up of only one type of mineral.

The magmatic minerals are stored at great depth in the Earth and are delivered from there into the shallow intrusions close to the planet's surface in the form of magmas - essentially hot liquids of molten minerals. On cooling, these magmas crystallise to form rocks that are commonly composed of several types of minerals.

However, some of these magmas crystallise into rocks that consist of only one mineral. A typical example is anorthosite - a magmatic rock that is made up of only one mineral called plagioclase - a component that is currently considered to be important for glass fibre manufacturing.

Anorthosites occur as very prominent, white-coloured layers in many layered intrusions worldwide and, in particular, are common for the famous platinum-rich Bushveld Complex in South Africa - the largest basaltic magma chamber in the Earth's crust - in which these layers extend for hundreds of kilometres.

For years, geologists have been puzzling about how these remarkable layers of pure anorthosites are produced.

"There were many attempts to solve this issue involving various processes that operate within the shallow magma chambers, but they were not particularly successful," says Professor Rais Latypov from the School of Geosciences at Wits University.

However, Latypov and his team have now found an elegant solution to this long-standing petrological puzzle.

"We took a radically different approach and started searching for a mechanism to generate melts saturated in plagioclase alone outside of the shallow magma chambers," says Rais Latypov.

"We realised that some melts rising up from deep-seated magma chambers may become saturated in plagioclase alone. This happens in response to decompression as the melts ascend from the depth towards the Earth's surface." This research was published a paper in Scientific Reports.

When these magmas arrive in a shallow magma chamber and cool there, they may crystallise stratiform layers of pure plagioclase composition like the ones we observe in the Bushveld Complex.

Latypov and his team believe that their work represents a significant advance in the understanding of the Earth's magmatic systems.

"This study provides a long-missing bridge between volcanology - where we mostly deal with the generation of melts and their ascent - and igneous petrology that mainly focuses on crystallisation of these melts within magma chambers," says Latypov.

"We can now paint a much better picture of how some of Earth's valuable minerals are derived from the Earth's depth and deposited in the form of monomineralic layers in the shallow intrusions, thus making them much easier to access."

Credit: 
University of the Witwatersrand

CRISPR-HOT: A new tool to 'color' specific genes and cells

video: Through CRISPR-HOT, an important component of the cellular division process (the mitotic spindle) was visualized in green. First, a movie of cell division in a healthy liver organoid, then a movie of cell division in a liver organoid in which the cancer gene TP53 was disabled. The latter resulted in frequent abnormal divisions.

Image: 
Benedetta Artegiani, Delilah Hendriks, ©Hubrecht Institute

Researchers from the group of Hans Clevers at the Hubrecht Institute have developed a new genetic tool to label specific genes in human organoids, or mini organs. They used this new method, called CRISPR-HOT, to investigate how hepatocytes divide and how abnormal cells with too much DNA appear. By disabling the cancer gene TP53, they showed that unstructured divisions of abnormal hepatocytes were more frequent, which may contribute to cancer development. Their results were published in the scientific journal Nature Cell Biology.

Organoids are mini organs that can be grown in the lab. These mini-organs grow from a very small piece of tissue, and this is possible for various organs. The ability to genetically alter these organoids would help a great deal in studying biological processes and modelling diseases. So far, however, the generation of genetically altered human organoids has proven difficult due to the lack of easy genome engineering methods.

CRISPR-HOT

A few years ago, researchers discovered that CRISPR/Cas9, which acts like tiny molecular scissors, can precisely cut the DNA at a specific place. This new technology greatly helped and simplified genetic engineering. "The little wound in the DNA can activate two different mechanisms of repair in the cells, both of which can be used by researchers to coerce the cells to take up a new piece of DNA at the place of the wound," says Delilah Hendriks (Hubrecht Institute). One of these methods, called non-homologous end joining, was thought to make frequent mistakes and was therefore until now not often used to insert new pieces of DNA. "Since some earlier work in mice indicated that new pieces of DNA can be inserted via non-homologous end joining, we set out to test this in human organoids," says Benedetta Artegiani (Hubrecht Institute). Artegiani and Hendriks then discovered that inserting any piece of DNA into human organoids through non-homologous end joining is actually more efficient and robust than the other method that has been used until now. They named their new method CRISPR-HOT.

Coloring cells

The researchers then used CRISPR-HOT to insert fluorescent labels into the DNA of human organoids, in such a way that these fluorescent labels were attached to specific genes they wanted to study. First, the researchers marked specific types of cells that are very rare in the intestine: the enteroendocrine cells. These cells produce hormones to regulate for example glucose levels, food intake, and stomach emptying. Because these cells are so rare, they are difficult to study. However, with CRISPR-HOT, the researchers easily "painted" these cells in different colors, after which they easily identified and analyzed them. Second, the researchers painted organoids derived from a specific cell type in the liver, the biliary ductal cells. Using CRISPR-HOT they visualized keratins, proteins involved in the skeleton of cells. Now that they could look at these keratins in detail and at high resolution, the researchers uncovered their organization in an ultra-structural way. These keratins also change expression when cells specialize, or differentiate. Therefore, the researchers anticipate that CRISPR-HOT may be useful to study cell fate and differentiation.

Abnormal cell division in the liver

Within the liver, there are many hepatocytes that contain two (or even more) times the DNA of a normal cell. It is unclear how these cells are formed and whether they are able to divide because of this abnormal quantity of DNA. Older adults have more of these abnormal hepatocytes, but it is unclear if they are related to diseases such as cancer. Artegiani and Hendriks used CRISPR-HOT to label specific components of the cell division machinery in hepatocyte organoids and studied the process of cell division. Artegiani: "We saw that 'normal' hepatocytes divide very orderly, always splitting into two daughter cells in a certain direction." Hendriks: "We also found several divisions in which an abnormal hepatocyte was formed. For the first time we saw how a 'normal' hepatocyte turns into an abnormal one." In addition, the researchers studied the effects of a mutation often found in liver cancer, in the gene TP53, on abnormal cell division in hepatocytes. Without TP53, these abnormal hepatocytes divided much more often. This may be one of the ways that TP53 contributes to cancer development.

The researchers believe that CRISPR-HOT can be applied to many types of human organoids to visualize any gene or cell type and to study many developmental and disease-related questions.

Credit: 
Hubrecht Institute

NASA finds ex-Tropical Cyclone Esther moving back inland

image: On Mar. 2, 2020, the MODIS instrument that flies aboard NASA's Aqua satellite provided a visible image of Esther's remnant clouds, showing the storm had moved back inland and away from the coast.

Image: 
NASA Worldview

Ex-Tropical Cyclone Esther just won't give up. The storm formed in the South Pacific Ocean, tracked across Australia's Northern Territory and reached the Kimberley coast of Western Australia, and has now turned around. NASA's Aqua satellite provided forecasters with a visible image of the storm turning back into Western Australia on March 2.

On March 2, the Moderate Resolution Imaging Spectroradiometer (MODIS) instrument that flies aboard NASA's Aqua satellite provided a visible image of Esther's remnant clouds, showing the storm had moved back inland and away from the coast.

The Australian Bureau of Meteorology (ABM) issued a Flood Watch for the Tanami Desert, Central Desert, MacDonnell Ranges, Barkly, Georgina River and Simpson Desert on March 2. A flood warning is current for Sturt Creek District in Western Australia.

At 10:48 a.m. ACST on Monday, March 2, the ABM forecast said, "Rainfall is expected to increase from today with widespread daily totals of 50 - 80 mm [2 to 3.1 inches] and isolated falls of 150 mm [5.9 inches] expected for the northern Tanami Desert. Rainfall into Tuesday is expected to increase with widespread falls of 70 - 120 mm [2.8 to 4.7 inches] expected in the Central Desert and southeastern Tanami Desert. Isolated falls of 180 mm [7.0 inches] could also be possible in places.

Rainfall is expected to increase in the MacDonnell Ranges and southern Barkly during Tuesday with 40 - 100 mm [1.6 to 3.9 inches] daily totals expected into Wednesday. Rainfall extends to the upper Georgina River and Simpson Desert during Wednesday with daily rainfall totals of 20 - 80 mm [0.8 to 3.1 inches] expected in many areas."
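
The bracketed conversions follow from 1 inch = 25.4 mm; a minimal sketch to reproduce them (the one-decimal rounding is our convention here, not the ABM's):

```python
# Reproduce the bracketed rainfall conversions (1 inch = 25.4 mm).
def mm_to_inches(mm: float) -> float:
    return mm / 25.4

for mm in (50, 80, 150, 70, 120, 180, 40, 100, 20):
    print(f"{mm} mm = {mm_to_inches(mm):.1f} in")
```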

Many roads in the flood watch area, including major transportation routes, are expected to be affected on Mar. 2 and may become impassable, leaving some communities and homesteads isolated.

Later today, ex-Tropical Cyclone Esther is expected to move into the northern Tanami District from the west as a strong area of tropical low pressure.

Credit: 
NASA/Goddard Space Flight Center

Wake Forest scientists create world's most sophisticated lab model of the human body

WINSTON-SALEM, NC - March 2, 2020 - Scientists at the Wake Forest Institute for Regenerative Medicine (WFIRM) have developed the world's most sophisticated laboratory model of the human body, creating a system of miniaturized organs that can be used to detect harmful effects of drugs before they are prescribed to patients. Using such a system to screen potential pharmaceuticals could have a significant impact on speeding new drugs to market, lowering the cost of clinical trials, and reducing or eliminating animal testing.

The system, developed with funding provided by the Defense Threat Reduction Agency (DTRA), is built from many human cell types that are combined into tissues representing a majority of the organs in the human body, such as the heart, liver and lungs. Each of these miniature organs is a tiny 3D tissue-like structure about one millionth the size of an adult human organ. The system mimics human tissues and organs and can be used as a testing and prediction platform.

"The most important capability of the human organ tissue system is the ability to determine whether or not a drug is toxic to humans very early in development, and its potential use in personalized medicine," said Anthony Atala, MD, of the Wake Forest Institute for Regenerative Medicine and the study's senior author. "Weeding out problematic drugs early in the development or therapy process can literally save billions of dollars and potentially save lives."

In fact, WFIRM's miniature organ model has already measured toxicity in several drugs that were approved for human use but later pulled from the market when they turned out to be harmful to people. Although toxicity from the recalled drugs was not found initially using standard 2D cell culture systems and animal testing models, and adverse effects were not detected through three phases of human clinical trials, the system developed at WFIRM readily detected toxicity, replicating the damage seen in humans.

In a paper published in the journal Biofabrication, the researchers detail how the miniature organs were created and how the human organ tissue system works. Because each type of tissue has its own specific requirements, a toolbox of biofabrication techniques was employed to create each miniaturized organ.

Tiny samples of human tissue cells are isolated and engineered into miniature versions of the human organ. They can contain blood vessel cells, immune system cells, and even fibroblasts, the cells of connective tissue. Each of these organs, also known as organ tissue equivalents, performs the same functions it does in the human body: the heart beats about 60 times each minute, the lung breathes air from the surrounding environment, and the liver breaks down toxic compounds into harmless waste products.

"We knew very early on that we needed to include all of the major cell types that were present in the original organ," said co-author Aleks Skardal, PhD, formerly of WFIRM and now at Ohio State University. "In order to model the body's different responses to toxic compounds, we needed to include all of the cell types that produce these responses."

Another hallmark feature of WFIRM's human organ tissue system is its blood-like circulatory system. Each system contains media, a substance carrying nutrients and oxygen, that is circulated among all the organ types, delivering oxygen and removing waste. The circulatory system in these devices is very small, employing a technology known as microfluidics to recirculate test compounds through the organ system and remove the drug breakdown products that each organ produces.

The WFIRM team recognized very early on that drugs and toxic molecules don't move neatly from one organ to the next. Rather than transfer samples from one organ type to the next, the researchers built a microfluidic circuit that recirculates samples, over and over, through each organ, in much the same way that the heart recirculates molecules through the human body in the blood.
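
The recirculation scheme can be pictured with a simple toy model. The sketch below is purely illustrative, assuming first-order clearance of a compound on each pass through each organ; the organ names and rate constants are hypothetical stand-ins, not WFIRM's measured parameters or software:

```python
# Toy compartment model of a recirculating microfluidic organ system.
# Each "organ" removes a first-order fraction of the drug per pass;
# the shared medium recirculates, so every organ sees every compound.
# All values are illustrative, not measured parameters.

clearance_per_pass = {   # fraction of drug each organ clears per circuit
    "liver": 0.08,       # hepatic breakdown (hypothetical rate)
    "heart": 0.01,
    "lung": 0.02,
}

def recirculate(concentration: float, n_passes: int) -> list[float]:
    """Drug concentration in the shared medium after each full circuit."""
    history = [concentration]
    for _ in range(n_passes):
        for fraction in clearance_per_pass.values():
            concentration *= (1.0 - fraction)
        history.append(concentration)
    return history

if __name__ == "__main__":
    for i, c in enumerate(recirculate(1.0, 10)):
        print(f"pass {i:2d}: {c:.3f} of initial dose remaining")
```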

WFIRM's human organ tissue system was not easy to develop. The institute's scientists have been working for close to three decades to build human-scale organs for transplantation into patients. To date, more than 15 tissue and organ products/technologies developed by WFIRM scientists, including muscle, bladder and vaginal organs, have been tested in humans in clinical trials.

"Creating microscopic human organs for drug testing was a logical extension of the work we have accomplished in building human-scale organs," said co-author Thomas Shupe, PhD, of WFIRM. "Many of the same technologies we have developed at the human-scale level, like including a very natural environment for the cells to live in, also produced excellent results when brought down to the microscopic level."

Because the WFIRM system contains the right cells, in the right numbers, from the right species, its data are much more predictive of the biological responses expected in humans.

Credit: 
Atrium Health Wake Forest Baptist

Scientists succeed in measuring electron spin qubit without demolishing it

A group of scientists from the RIKEN Center for Emergent Matter Science in Japan has succeeded in taking repeated measurements of the spin of an electron in a silicon quantum dot (QD) without changing the spin in the process. This type of "non-demolition" measurement is important for creating quantum computers that are fault tolerant.

Quantum computers promise to make certain classes of calculations easier, such as many-body problems, which are extremely difficult and time-consuming for conventional computers. Essentially, they involve measuring a quantum value that is never in a single state like a conventional transistor, but instead exists as a superposition of states--in the same way that Schrödinger's famous cat cannot be said to be alive or dead until it is observed. Using such systems, it is possible to conduct calculations with a qubit that is a superposition of two values and then determine statistically what the correct result is. Quantum computers that use single electron spins in silicon QDs are seen as attractive due to their potential scalability and because silicon is already widely used in electronics technology.

The key difficulty with developing quantum computers, however, is that they are very sensitive to external noise, making error correction critical. So far, researchers have developed silicon QDs in which single electron spins have a long information retention time and can be operated with high precision, but quantum non-demolition measurement--a key to effective error correction--has proven elusive. The conventional method for reading out single electron spins in silicon is to convert the spins into charges that can be rapidly detected; unfortunately, the detection process affects the electron spin.

Now, in research published in Nature Communications, the RIKEN team has achieved such a non-demolition measurement. The key insight that allowed the group to make the advance was the use of an Ising-type interaction--a model of ferromagnetism describing how the spins of neighboring electrons align, leading to ferromagnetism across the entire lattice. Essentially, they transferred the spin information--up or down--of an electron in one QD to an electron in the neighboring QD using the Ising-type interaction in a magnetic field, and then measured the spin of the neighbor using the conventional charge-conversion method. This left the original spin unaffected and allowed repeated, rapid measurements of the neighbor.
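
To see schematically why an Ising-type coupling permits a non-demolition readout, consider the generic two-spin Hamiltonian below; this is a simplified textbook form, not the exact Hamiltonian used in the paper:

```latex
% Generic Ising-type coupling between the data spin (1) and its neighbor (2).
% The neighbor's precession frequency shifts by \pm J/2 depending on the
% state of spin 1, while \sigma_z^{(1)} commutes with H, so reading out the
% neighbor does not disturb the data spin -- the non-demolition condition.
H = \frac{\hbar\omega_1}{2}\,\sigma_z^{(1)}
  + \frac{\hbar\omega_2}{2}\,\sigma_z^{(2)}
  + \frac{\hbar J}{4}\,\sigma_z^{(1)}\sigma_z^{(2)}
```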

"Through this," explains Group Director Seigo Tarucha, who led the research group, "we were able to achieve a non-demolition fidelity rate of 99%, and by using repeated measurements would get a readout accuracy of 95%. We have also shown that theoretically, this could be increased to out 99.6%, and plan to continue work toward reaching that level."

He continues, "This is very exciting, because if we can combine our work with high-fidelity single- and two-qubit gates, which are currently being developed, we could potentially build a variety of fault-tolerant quantum information processing systems using a silicon quantum-dot platform."

Credit: 
RIKEN