
The era of quantum supremacy is here

Please Note: The 2020 American Physical Society (APS) March Meeting that was to be held in Denver, Colorado from March 2 through March 6 has been canceled. The decision was made late Saturday (February 29), out of an abundance of caution and based on the latest scientific data available regarding the transmission of the coronavirus disease (COVID-19). See our official release on the cancelation for more details.

DENVER, COLO., FEBRUARY 28, 2020 -- Google made headlines in late 2019 with an experiment that demonstrated quantum supremacy for the first time. Their quantum computer, the Sycamore Processor, took a mere 200 seconds to perform a computation that would have taken a traditional computer 10,000 years. Members of the research team--Pedram Roushan, Zijun Chen, and Kevin Satzinger--will discuss this groundbreaking feat at the 2020 American Physical Society March Meeting in Denver.

Roushan will report the results of this experiment, discuss its computational complexity, and show how the team could verify the results of their quantum computation. Looking to the future, he will also demonstrate the programmability of the Sycamore Processor and how various algorithms can be implemented on it.

When the Google research team performed their experiment, they measured the outputs of the Sycamore Processor's 53 superconducting qubits simultaneously. This is one of the largest simultaneous measurements of a superconducting quantum computer ever performed. Google researcher Zijun Chen will present the new strategies the team used to get precise results from their unprecedentedly large experiment and to reduce measurement errors. Such strategies could be crucial to operating larger quantum computers.

"As far as I know, no one's ever done anything quite like this," Chen said.

Kevin Satzinger, also a Google researcher, will discuss the engineering advances in the Sycamore Processor that allowed it to accomplish its record-breaking feat. He will also describe the benchmarking methods used to evaluate its performance--methods that were themselves infeasible to run on a classical computer and therefore demonstrated quantum supremacy. Lastly, Satzinger will present the digital error model used by the researchers, which was verified by the experiment. This digital error model represents a step toward solving the problem of quantum error correction, the major obstacle to scaling up quantum computers.

Quantum computing scientists refer to the current period of time as the Noisy Intermediate-Scale Quantum (NISQ) era. Quantum computers of over 50 qubits have been created, entering an intermediate scale where we might expect to see evidence of quantum supremacy: a quantum computer solving a problem that a classical computer might take millennia to process. However, measurements of these quantum computers will be noisy because 50-100 qubits is not enough to implement a self-correcting algorithm that would eliminate or reduce noise.
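
To get a rough sense of why 50-plus qubits put such problems beyond brute-force classical simulation, consider a back-of-envelope count of the amplitudes needed to describe a general 53-qubit state (an illustrative sketch, not a calculation from the APS session):

```python
# Illustrative back-of-envelope estimate (not from the press release):
# a general 53-qubit state is specified by 2**53 complex amplitudes,
# and merely storing them would require on the order of 100 petabytes.

n_qubits = 53
amplitudes = 2 ** n_qubits                   # ~9.0e15 complex amplitudes
bytes_per_amplitude = 16                     # two 64-bit floats per amplitude
memory_petabytes = amplitudes * bytes_per_amplitude / 1e15

print(f"{amplitudes:.2e} amplitudes, ~{memory_petabytes:.0f} PB of memory")
```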

Two presenters in the session will discuss their research on quantum error correction in the NISQ era, when quantum computers don't have enough qubits to implement self-correction. Ramis Movassagh of IBM Research tackled a theoretically complex problem: how do you quantify the difficulty of a problem posed to a quantum computer when that difficulty needs to account for the inherent noise? Previous work suggested that the task of Random Circuit Sampling (RCS) has a certain level of difficulty for a quantum computer as the number of qubits becomes arbitrarily large; this is the task Google's Sycamore Processor ran on 53 qubits. That earlier mathematical work, however, left no room for noise and errors, which might decrease the problem's difficulty. Movassagh proved that RCS remains a difficult task even when certain additive errors are included, significantly advancing toward a proof of the quantum supremacy conjecture. This work could provide a new way of considering problem difficulty in an era of noisy quantum computers.

Gian Giacomo Guerreschi from Intel Corporation and collaborators from Carnegie Mellon University have tested a noise-robust algorithm that is a hybrid of quantum and classical computing: the Quantum Approximate Optimization Algorithm (QAOA). "Its back-and-forth between quantum and classical computing should get rid of systematic errors," says Guerreschi. Because it's a mix of two computing types, he prefers to call it a protocol rather than an algorithm, and says they are developing a way to tailor QAOA to different kinds of problems.

Sebastian Deffner, of the University of Maryland Baltimore County, will present a new theory of quantum thermodynamics at the quantum supremacy session. Since the 1960s, scientists have had a good understanding of how much energy computers require to process a given amount of information. This understanding is missing for quantum computers.

"Let's say you take a selfie and you uploaded it to Snapchat," he said. "You can calculate exactly how much energy has been drawn from the battery of your phone. For quantum computers, we don't have that yet."

Theoretically, quantum computers may be exponentially better than classical computers at solving certain problems; but that might mean they require exponentially more power to do so. And work and energy operate differently on the quantum scale, where states can exist in superposition and positions carry inherent uncertainties. "We cannot rely on old concepts anymore, but we need something new," Deffner said.

Credit: 
American Physical Society

Children who read books daily score higher in school tests, vast new study states

What children choose to read outside school directly influences their academic performance, according to a major new study led by the University of Malaga and UCL, and published in the peer-reviewed journal Oxford Review of Education.

Using longitudinal census data to look at more than 43,000 students, aged 10 to 11 and then again when they were 13 to 14, the research provides substantial evidence that pupils who enjoy reading high-quality books daily score higher in tests.

The average marks of pupils who read books rose by 0.22 points overall, which is the equivalent of 3 months' worth of additional secondary school academic growth.

The study demonstrated no similar advantage for children's reading daily newspapers, comics or magazines, and only marginal benefits from short stories.

The findings have important implications for parents, teachers and policymakers, and the international research team is recommending that young people devote their reading time solely to books.

"Although three months' worth of progress may sound comparatively small to some people, it equates to more than 10% of the three academic secondary school years measured - from when these young people are aged 11 years old to 14, which we know is a hugely developmental period," explains co-author Professor John Jerrim, from the UCL.

"In an increasingly digital world, it's important that young people are encouraged to find time to read a good book.

"Other less complex and less engaging forms of reading are unlikely to bring the same benefits for their cognitive development, and shouldn't be counted as part of their reading time.

"This is particularly important for low-achievers, where any association is likely to be strongest."

Lead researcher Luis Alejandro Lopez-Agudo, from the University of Malaga, added: "Reading is a fundamental skill that plays a key part in all our lives.

"Our results provide further evidence that it's not only whether young people read or not that matters - but also what they read."

The amount of time children spend reading is already understood to help develop their literacy skills. This ability increases through practice and by trying longer and more challenging texts.

Few studies though have focused on whether the type of material children choose influences their achievements at school.

This study, looking at pupils in Spain, attempted to establish whether a link exists between literacy and mathematics scores and the type of material children look at in their spare time, as well as how long they spend doing this. Comics, short stories, books, newspapers and magazines were the texts included in the research.

The researchers used data from a census carried out by the Andalusian Agency of Education Assessment. This included questionnaire responses completed during 2008 to 2009 by 10 to 11-year-olds, and from those aged 13 to 14 during 2011 and 2012.

Children's attitudes towards school were considered along with prior achievement levels. Parents were also asked about their own reading habits and how involved they were in their child's education.

The results showed the more frequently children read books, the better they performed in school tests as teenagers. The same effect was not observed with comics, newspapers and magazines. Specifically, researchers found:

13 to 14-year-olds who read books every or almost every day scored 0.22 standard deviations higher (the equivalent of three months) on the literacy test than those who read books almost never.

There is evidence of positive spill-overs into other subjects, with a difference of around 0.20 standard deviations in mathematics.

There was some benefit from short stories for children who enjoyed them at least once a month. The researchers concluded though that increasing the frequency of this to weekly or daily was unlikely to bring any further benefits.
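
As a rough consistency check of the figures above (my own arithmetic, assuming an academic year of roughly nine months, which the study does not specify), the three-month equivalent does indeed exceed 10% of the three school years measured:

```python
# Rough consistency check (own arithmetic; the ~9-month academic year is an
# assumption, not a figure from the study).
months_gained = 3                      # reported equivalent of the 0.22 SD gain
years_measured = 3                     # ages 11 to 14
months_per_academic_year = 9           # assumed length of an academic year

share = months_gained / (years_measured * months_per_academic_year)
print(f"{share:.0%} of the period measured")   # ~11%, i.e. "more than 10%"
```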

The study also highlighted the reading patterns across different groups of children. It showed:

Girls seem to read short stories, books and newspapers more frequently than boys, while the opposite holds true for comics and magazines.

Young people from advantaged backgrounds read all the text types more frequently than those from disadvantaged homes.

High-achieving students (according to their 5th grade test scores) were more likely to read tales/short novels and books compared to low-achieving students, though with little difference in terms of reading comics, newspapers and magazines.

The findings of this study should be interpreted in the context of some limitations and the need for further research. These include the research being carried out in one particular region within Spain, and the focus upon academic progress made during the early teenage years. At this point, reading skills are already quite well-developed - there is no data for younger children.

Credit: 
Taylor & Francis Group

Metal-organic frameworks can separate gases despite the presence of water

image: A visualization of the structure of metal-organic frameworks with the metal (cobalt, blue) at the corners and the organic structures spanning the sides (carbon, gray; oxygen, red).

Image: 
Vervoorts P. et al., <em>ACS Applied Materials & Interfaces</em>, Jan. 27, 2020

Metal-organic frameworks (MOFs) are promising materials for inexpensive and less energy-intensive gas separation even in the presence of impurities such as water.

Experimental analyses of the performance of metal-organic frameworks (MOFs) for the separation of propane and propene under real-world conditions revealed that the most commonly used theory to predict the selectivity does not yield accurate estimates, and also that water as an impurity does not have a detrimental effect on the material's performance.

Short-chain hydrocarbons are produced in mixtures after treatment of crude oil in refineries and need to be separated in order to be industrially useful. For example, propane is used as a fuel and propene as a raw material for chemical synthesis such as the production of polymers. However, the separation process usually requires high temperatures and pressures, and the additional need to remove impurities such as water makes the process costly and energy-intensive.

The structure of the studied MOF offers a long-lived, adaptable and, most importantly, efficient separation alternative under ambient conditions. It builds on the fact that unsaturated molecules such as propene can form complexes with the material's exposed metal atoms, while saturated ones such as propane cannot. While research has focused on developing different metal-organic frameworks for different separation processes, the feasibility of using these materials in industrial-scale applications is commonly gauged only by relying on a theory that makes many idealizing assumptions about both the material and the purity of the gases. Thus, it has not been clear whether these predictions hold under more complicated but also more realistic conditions.
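
For context, the quantity such theories try to predict is the adsorption selectivity, conventionally defined from the adsorbed- and gas-phase compositions. The sketch below uses this textbook definition with placeholder numbers; it is not data or code from the study, and the release does not name the specific theory used:

```python
# Textbook definition of adsorption selectivity, shown with placeholder
# numbers (not data from the study). x_* are adsorbed-phase mole fractions,
# y_* are gas-phase mole fractions.

def selectivity(x_propene, x_propane, y_propene, y_propane):
    """S = (x_propene / x_propane) / (y_propene / y_propane)."""
    return (x_propene / x_propane) / (y_propene / y_propane)

# Hypothetical equimolar feed in which the MOF adsorbs three times more propene:
print(selectivity(x_propene=0.75, x_propane=0.25,
                  y_propene=0.50, y_propane=0.50))   # -> 3.0
```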

A team of Hokkaido University researchers around Professor Shin-ichiro Noro in collaboration with Professor Roland A. Fischer's group at the Technical University of Munich conducted a series of measurements on the performance of a prototypical MOF to ascertain the material's real-world selectivity, for both completely dry frameworks and ones pre-exposed to water.

Their results, recently published in ACS Applied Materials & Interfaces, show that the predicted selectivities of the material are too high compared to the real-world results. They also demonstrate that water does not drastically decrease the selectivity, although it does reduce the material's capacity to adsorb gas. The team then performed quantum-chemical computations to understand why, and realized that the water molecules themselves offer new binding sites to unsaturated hydrocarbons such as propene (but not propane), thus retaining the material's functionality.

The researchers state: "We showed the power of multi-component adsorption experiments to analyze the feasibility of using an MOF system." They thus want to raise awareness of the shortcomings of commonly used theories and motivate other groups to also use a combination of different real-world measurements.

Credit: 
Hokkaido University

Overlooked arch in the foot is key to its evolution and function

New Haven, Conn. --A long-overlooked part of the human foot is key to how the foot works, how it evolved, and how we walk and run, a Yale-led team of researchers said.

The discovery upends nearly a century of conventional thinking about the human foot and could open new avenues to explore in evolutionary biology as well as guide new designs for robotic and prosthetic feet, said the study team.

The discovery, made by an international team of researchers and led by Yale engineer Madhusudhan Venkadesan, was published Feb. 26 in the journal Nature. The team was led jointly by Venkadesan, Shreyas Mandre from the University of Warwick, and Mahesh Bandi from the Okinawa Institute of Science & Technology (OIST).

When humans walk and run, the front of each foot repeatedly pushes on the ground with a force exceeding several times the body's weight. Despite these strong forces, the human foot maintains its shape without severely bending. Such stiff feet -- unique to humans among primates -- were important for the evolution of bipedalism.

What makes human feet so stiff? According to conventional thinking, it's mainly the longitudinal arch of the foot. This arch runs from heel to forefoot and is reinforced by elastic tissues underneath it. The arch and tissues create a bow-and-string structure that for nearly a century was considered the main source of the foot's stiffness.

But the foot has a second arch that runs across the width of the midfoot, known as the transverse arch. Venkadesan and his colleagues investigated the transverse arch, which had not been studied previously. They performed a series of experiments, using mechanical mimics of the foot, cadaveric human feet, and fossil samples from long-extinct human ancestors and relatives (hominins). Their results show that the transverse arch is the main source of the foot's stiffness.

The reason the transverse arch is so important can be found in your wallet. Take out a dollar bill, hold it at one end, and the dollar flops around. But press your thumb down to give the dollar some curvature, and it stands out straight.

"That type of effect also works in the foot," said Venkadesan, assistant professor of mechanical engineering and materials science. "It's not as simple as a sheet of paper because there are many other tissues and structures in the foot, but the principle turns out to be the same."

Using mathematical analysis and experiments, they gleaned the mechanical principle for why curvature induces stiffness -- namely that bending a curved structure causes the material to also stretch. Even a thin sheet of paper is quite stiff if you try to stretch it. The transverse curvature engages this stretching stiffness to stiffen the whole structure, explained the researchers.

Because the foot is a complicated, multi-functional structure, it is not possible to modify just the transverse arch to test the theory without affecting other parts. So, using experiments on mechanical mimics of the foot, the researchers came up with a novel idea to see whether the transverse arch works the same way in real human feet.

"We found that transverse springs, which mimic tissues spanning the width of your foot, are crucial for curvature-induced stiffness," said Ali Yawar, a Ph.D. student in Venkadesan's lab. "So we expected that stiffness would decrease in real human feet if we were to remove the transverse tissues and leave everything else untouched."

Together with Steven Tommasini, a research scientist at the Yale School of Medicine, they conducted experiments on the feet of human cadavers.

"We found that the transverse arch, acting through the transverse tissues, is responsible for nearly half of the foot's stiffness, considerably more than what the longitudinal arch contributes," said Carolyn Eng, an associate research scientist in Venkadesan's lab.

These results may also explain how the 3.66-million-year-old Australopithecus afarensis, the same species as the fossil Lucy, could have walked and left a human-like footprint despite having no apparent longitudinal arch. Working with Andrew Haims, a professor at the Yale School of Medicine, the researchers developed a new technique to measure transverse curvature using partial skeletons of the foot. By applying this technique to fossil samples, including A. afarensis, they traced how the transverse arch evolved among early hominins.

"Our evidence suggests that a human-like transverse arch may have evolved over 3.5 million years ago, a whole 1.5 million years before the emergence of the genus Homo, and was a key step in the evolution of modern humans," Venkadesan said.

The findings also open new lines of thought for podiatry, as well as the fields of evolutionary biology and robotics, the researchers said.

Credit: 
Yale University

Mystery surrounding dinosaur footprints on a cave ceiling in Central Queensland solved

image: Ross Staines measuring the footprints 4.5 metres above the cave floor (c. 1954).

Image: 
Copyright STAINES

The mystery surrounding dinosaur footprints on a cave ceiling in Central Queensland has been solved, in an article published in Historical Biology, after more than half a century.

University of Queensland palaeontologist Dr Anthony Romilio discovered pieces to a decades-old puzzle in an unusual place - a cupboard under the stairs of a suburban Sydney home.

"The town of Mount Morgan near Rockhampton has hundreds of fossil footprints and has the highest dinosaur track diversity for the entire eastern half of Australia," Dr Romilio said.

"Earlier examinations of the ceiling footprints suggested some very curious dinosaur behaviour; that a carnivorous theropod walked on all four legs.

"You don't assume T. rex used its arms to walk, and we didn't expect one of its earlier predatory relatives of 200 million years ago did either."

Researchers wanted to determine whether this dinosaur really did move using both its feet and its arms, but found that accessing research material was difficult.

"For a decade the Mount Morgan track site has been closed, and the published 1950s photographs don't show all the five tracks," Dr Romilio said.

However, Dr Romilio had a chance meeting with local dentist Dr Roslyn Dick, whose father had found many dinosaur fossils over the years.

"I'm sure Anthony didn't believe me until I mentioned my father's name - Ross Staines," Ms Dick said.

"Our father was a geologist and reported on the Mount Morgan caves containing the dinosaur tracks in 1954.

"Besides his published account, he had high-resolution photographs and detailed notebooks, and my sisters and I had kept it all.

"We even have his dinosaur footprint plaster cast stored under my sister's Harry Potter cupboard in Sydney."

Dr Romilio said the wealth and condition of the 'dinosaur information' archived by Dr Dick and her sisters Heather Skinner and Janice Millar was amazing.

"I've digitised the analogue photos and made a virtual 3D model of the dinosaur footprint, and left the material back to the family's care," he said.

"In combination with our current understanding of dinosaurs, it told a pretty clear-cut story."

The team first concluded that all five tracks were foot impressions - that none were dinosaur handprints.

Also, the splayed toes and moderately long middle digit of the footprints resembled tracks of two-legged herbivorous dinosaurs, differing from prints made by theropods.

"Rather than one dinosaur walking on four legs, it seems as though we got two dinosaurs for the price of one - both plant-eaters that walked bipedally along the shore of an ancient lake," Dr Romilio said.

"The tracks lining the cave-ceiling were not made by dinosaurs hanging up-side-down, instead the dinosaurs walked on the lake sediment and these imprints were covered in sand.

"In the Mount Morgan caves, the softer lake sediment eroded away and left the harder sandstone in-fills."

Credit: 
Taylor & Francis Group

Explained: Why water droplets 'bounce off the walls'

image: An image showing the water drop bounce

Image: 
University of Warwick

When a water droplet lands on a surface it can splash, coat the surface cleanly, or in special conditions bounce off like a beach ball

Droplets only bounce when the speed of collision with a surface is just right, creating a very thin nanoscale air cushion for it to rebound off

Drop collision is integral to technology such as 3D printing and spray cooling of next-generation electronics, and understanding this can help future developments in these fields

University of Warwick researchers can now explain why some water droplets bounce like a beach ball off surfaces without ever actually touching them. As a result, the design and engineering of future droplet technologies can be made more precise and efficient.

Collisions between liquid drops and surfaces, or other drops, happen all the time. For example, small water drops in clouds collide with each other to form larger drops, which can eventually fall and impact on a solid, like your car windscreen.

Drops can behave differently after the point of collision, some make a splash, some coat the surface cleanly, and some can even bounce like a beach ball.

In the article, published today in Physical Review Letters, researchers from the University of Warwick have found an explanation for experimental observations that some droplets bounce.

Remarkably, the fate of the drop is determined by the behaviour of a tiny cushion of air whose height can reach the scale of nanometres. To get a sense of scale, think of something the size of the moon bouncing from a garden trampoline.

Even if the surface is perfectly smooth, as in laboratory conditions, an affinity between the drop molecules and the wall molecules (known as van der Waals attraction) will mean that in most cases the drop is pinched down onto the surface, preventing it from bouncing.

The research reveals, through highly detailed numerical simulations, that for a droplet to bounce the speed of collision must be just right. Too fast, and the momentum of the drop flattens the air cushion too thinly. Too slow, and it gives the van der Waals attraction time to take hold. At the perfect speed, though, the drop can perform a clean bounce, like a high jumper just clearing the bar.

Professor Duncan Lockerby from the School of Engineering at the University of Warwick comments:

"Drop collision is integral to technology we rely upon today, for example, in inkjet printing and internal combustion engines. Understanding better what happens to colliding droplets can also help the development of emerging technologies, such as 3D printing in metal, as their accuracy and efficiency will ultimately depend on what happens to drops post collision."

Dr James Sprittles from the Mathematics Institute at the University of Warwick adds:

"Importantly, the air cushion is so thin that molecules will often never encounter one another when crossing it, akin to the emptiness of outer space, and conventional theories fail to account for this. The new modelling approach we've developed will now have applications to droplet-based phenomena ranging from cloud physics for climate science through to spray cooling for next generation electronics."

Credit: 
University of Warwick

CT provides best diagnosis for COVID-19

OAK BROOK, Ill. (February 26, 2020) - In a study of more than 1,000 patients published in the journal Radiology, chest CT outperformed lab testing in the diagnosis of 2019 novel coronavirus disease (COVID-19). The researchers concluded that CT should be used as the primary screening tool for COVID-19.

In the absence of specific therapeutic drugs or vaccines for COVID-19, it is essential to detect the disease at an early stage and immediately isolate an infected patient from the healthy population.

According to the latest guidelines published by the Chinese government, the diagnosis of COVID-19 must be confirmed by reverse-transcription polymerase chain reaction (RT-PCR) or gene sequencing for respiratory or blood specimens, as the key indicator for hospitalization. However, with limitations of sample collection and transportation, as well as kit performance, the total positive rate of RT-PCR for throat swab samples has been reported to be about 30% to 60% at initial presentation.

In the current public health emergency, the low sensitivity of RT-PCR implies that a large number of COVID-19 patients won't be identified quickly and may not receive appropriate treatment. In addition, given the highly contagious nature of the virus, they carry a risk of infecting a larger population.

"Early diagnosis of COVID-19 is crucial for disease treatment and control. Compared to RT-PCR, chest CT imaging may be a more reliable, practical and rapid method to diagnose and assess COVID-19, especially in the epidemic area," the authors wrote.

Chest CT, a routine imaging tool for pneumonia diagnosis, is fast and relatively easy to perform. Recent research found that the sensitivity of CT for COVID-19 infection was 98% compared to RT-PCR sensitivity of 71%.

For the current study, researchers at Tongji Hospital in Wuhan, China, set out to investigate the diagnostic value and consistency of chest CT imaging in comparison to RT-PCR assay in COVID-19.

Included in the study were 1,014 patients who underwent both chest CT and RT-PCR tests between January 6 and February 6, 2020. With RT-PCR as reference standard, the performance of chest CT in diagnosing COVID-19 was assessed. For patients with multiple RT-PCR assays, the dynamic conversion of RT-PCR test results (negative to positive, and positive to negative, respectively) was also analyzed as compared with serial chest CT scans.

The results showed that 601 patients (59%) had positive RT-PCR results, and 888 (88%) had positive chest CT scans. The sensitivity of chest CT in suggesting COVID-19 was 97%, based on positive RT-PCR results. Among patients with negative RT-PCR results, 75% (308 of 413 patients) had positive chest CT findings. Of these, 48% were considered highly likely cases and 33% probable cases. Analysis of serial RT-PCR assays and CT scans showed that the interval between an initial negative and a subsequent positive RT-PCR result was 4 to 8 days.
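
As a rough check, the headline sensitivity can be reproduced from the counts reported above (my own arithmetic from the press-release figures, with RT-PCR taken as the reference standard):

```python
# Reconstructing the reported sensitivity from the press-release counts
# (own arithmetic; RT-PCR is treated as the reference standard).
total = 1014
rtpcr_pos = 601                 # patients with positive RT-PCR
ct_pos = 888                    # patients with positive chest CT
ct_pos_among_rtpcr_neg = 308    # CT-positive among the RT-PCR-negative group

rtpcr_neg = total - rtpcr_pos                              # 413
ct_pos_among_rtpcr_pos = ct_pos - ct_pos_among_rtpcr_neg   # 580

print(f"sensitivity: {ct_pos_among_rtpcr_pos / rtpcr_pos:.0%}")        # ~97%
print(f"CT+ among RT-PCR-: {ct_pos_among_rtpcr_neg / rtpcr_neg:.0%}")  # 75%
```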

"About 81% of the patients with negative RT-PCR results but positive chest CT scans were re-classified as highly likely or probable cases with COVID-19, by the comprehensive analysis of clinical symptoms, typical CT manifestations and dynamic CT follow-ups," the authors wrote.

Credit: 
Radiological Society of North America

Small precipitates make big difference in mitigating strength-ductility tradeoff

image: Nucleation of screw dislocations at nanoprecipitate-matrix interface.

Image: 
PENG Shenyou

Researchers from the Institute of Mechanics of the Chinese Academy of Sciences, teaming up with scientists from Singapore and the U.S., have found that nanoscale precipitates provide a unique sustainable dislocation source at sufficiently high stress.

The scientists discovered that densely dispersed nanoprecipitates simultaneously serve as dislocation sources and obstacles, leading to a sustainable and self-hardening deformation mechanism that enhances ductility and strength. The results were published in PNAS on Feb. 24, 2020.

In structural materials, resistance against deformation is of vital importance, specifically resistance to the initiation of plastic deformation or yielding; the stress at this point represents the strength of the material. Meanwhile, how much a metal can deform plastically - its ductility - is another important measure. Both strength and ductility depend on the movement of metal dislocations.

Movement of a dislocation becomes more difficult if a barrier or discontinuity enters its path; that is, the material is hardened. Among the many hardening routes, precipitate hardening has been well established and widely employed in engineering materials like Al alloys, Ni superalloys, steel, and recently discovered high-entropy alloys.

Precipitates serve as obstacles to dislocation glide and cause hardening of the material. However, they may lead to premature failure and decreased ductility. Obstacles to dislocation glide often lead to high stress concentration and even microcracks, a cause of progressive strain localization and the origin of the strength-ductility conflict.

According to the researchers, the key to mitigating the conventional strength-ductility tradeoff is to employ a mild yet homogeneous hardening mechanism at a high stress level. Nanoprecipitates provide a sustainable and self-hardening deformation mechanism that enhances ductility and strength. The condition for achieving sustainable dislocation nucleation from a nanoprecipitate is governed by the lattice mismatch between the precipitate and matrix.

Dr. PENG Shenyou, author of the study, said, "The interplay of the two length scales, precipitate size and spacing, can be utilized as an optimal design motif to produce a superb combination of strength and ductility, as well as provide a criterion for selecting precipitate size and spacing in material design."

These findings establish a foundation for strength-ductility optimization through densely dispersed nanoprecipitates in multiple-element alloy systems.

Credit: 
Chinese Academy of Sciences Headquarters

Weight-based bullying linked to increased adolescent alcohol, marijuana use

Adolescents who are bullied about their weight or body shape may be more likely to use alcohol or marijuana than those who are not bullied, according to new research published by the American Psychological Association.

The link between appearance-related teasing and substance use was strongest among overweight girls, raising special concerns about this group.

"This type of bullying is incredibly common and has many negative effects for adolescents," said lead study author Melanie Klinck, BA, a clinical research assistant at the University of Connecticut. "The combination of appearance-related teasing and the increased sensitivity to body image during adolescence may create a heightened risk for substance use."

"These findings raise larger issues about how society places too much emphasis on beauty and body image for girls and women and the damaging effects that may result," said Christine McCauley Ohannessian, PhD, professor of pediatrics and psychiatry at the University of Connecticut School of Medicine, as well as director of the Center for Behavioral Health at Connecticut Children's Medical Center and a study co-author.

"Schools and communities should specifically address appearance-related teasing in anti-bullying policies and substance-use interventions," she said. "Parents particularly have a role to play in addressing this issue. There is some startling research showing that some of the most hurtful examples of weight-based teasing come from parents or siblings, so families should be kind when they discuss the weight of their children."

The study, which was conducted at Connecticut Children's Medical Center, involved a survey of 1,344 students ages 11 to 14 from five public middle schools near Hartford, Connecticut. The students were asked if siblings, parents or peers had teased them about their weight, body shape or eating during the prior six months. More than half (55%) of the overall participants reported weight-based teasing, including three out of four overweight girls (76%), 71% of overweight boys, 52% of girls who weren't overweight, and 43% of boys who weren't overweight.

The participants also were asked about their alcohol and marijuana use. The results showed that frequent weight-based teasing was associated with higher levels of total alcohol use, binge drinking and marijuana use. In a follow-up survey six months later, weight-based teasing was still linked to total alcohol use and binge drinking. The research was published online in Psychology of Addictive Behaviors.

Previous research has found that boys have greater substance use in their teens and early adulthood, but girls begin using alcohol and drugs at an earlier age compared with boys. Those trends may be related to the societal pressures for girls to adhere to unrealistic body image ideals. This can damage their sense of self-worth and contribute to eating disorders and self-medication through substance use to cope with teasing or fit in with their peers, Klinck said.

"The old saying that 'sticks and stones may break my bones but words will never hurt me' is a fallacy that ignores the serious effects of emotional abuse and verbal bullying," Klinck said. "Weight-based discrimination appears to be one of the most common and seemingly socially sanctioned reasons to bully or discriminate against someone. As a society, we need to address the damage caused by this, especially for girls."

Credit: 
American Psychological Association

A better starting point for exploring entanglement

Quantum entanglement is perhaps one of the most intriguing phenomena known to physics. It describes how the fates of multiple particles can become entwined, even when separated by vast distances. Importantly, the probability distributions needed to define the quantum states of these particles deviate from the bell-shaped, or 'Gaussian', curves which underlie many natural processes. Non-Gaussian curves are not unique to quantum systems, however: they can also arise as mixtures of regular Gaussian curves, producing difficulties for physicists studying quantum entanglement. In new research published in EPJ D, Shao-Hua Xiang and colleagues at Huaihua University in China propose a solution to this problem. They suggest an updated set of equations which allows physicists to easily check whether or not a non-Gaussian state is genuinely quantum.

As physicists make more discoveries about the nature of quantum entanglement, they are rapidly making progress towards advanced applications in the fields of quantum communication and computation. The approach taken in this study could help speed up the pace of these advances. Xiang and colleagues acknowledge that while all previous efforts to distinguish between the two types of non-Gaussian curve have had some success, their choices of Gaussian curves as a starting point have so far meant that no approach has proven completely effective. Based on the argument that there can't be any truly reliable Gaussian reference for a genuinely quantum non-Gaussian state, the researchers present a new theoretical framework.

In their approach, Xiang's team encoded non-Gaussian characteristics into the mathematics of 'Wigner' distribution functions, which are related to the probability distributions of quantum particles. Their updated equations removed many of the complications typically involved in determining non-Gaussian curves from Gaussian reference points, greatly simplifying the calculations involved. If their techniques become widely accepted, they could enable researchers to more effectively study and exploit one of the most mysterious phenomena known to physics.
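
For background, the standard textbook definition of the Wigner distribution of a pure state with wavefunction $\psi$ (this formula is general quantum-optics background, not quoted from the study) is

\[
W(x,p) = \frac{1}{\pi\hbar}\int_{-\infty}^{\infty}\psi^{*}(x+y)\,\psi(x-y)\,e^{2ipy/\hbar}\,dy ,
\]

and a Gaussian state is one whose $W(x,p)$ is itself Gaussian. The difficulty addressed here is telling genuinely quantum non-Gaussian states apart from mere statistical mixtures of Gaussian ones.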

Credit: 
Springer

Chronic inflammation in pregnancy linked to childhood neurodevelopmental delays

Philadelphia, February 25, 2020 - In pregnant women, obesity, diabetes, hypertension, depression and anxiety can increase the chances of learning delays, behavior problems and mental health issues in their children's early years. A new study reported in the journal Biological Psychiatry, published by Elsevier, strengthens evidence that chronic low-grade inflammation, common to these maternal conditions, may be partly to blame for the higher risk of childhood neurodevelopmental delays.

Researchers have long suspected chronic maternal inflammation may play a role in altering neurodevelopmental trajectories, leading to adverse childhood outcomes. Earlier studies, involving animals, have implicated maternal inflammation as a mechanism causing neurodevelopmental delays in offspring.

"Our findings suggest a potential therapeutic strategy to reduce prenatal exposure to inflammation and improve childhood neurodevelopment outcomes," said first author Polina Girchenko, PhD, an epidemiologist and postdoctoral researcher in the Department of Psychology and Logopedics at University of Helsinki, Finland.

To investigate further, Dr. Girchenko and her colleagues analyzed data from 418 pregnant women and their children, aged 7 to 11 years, in Southern and Eastern Finland. The women's data came from a study called PREDO, which is designed to predict and prevent preeclampsia during pregnancy, so risk factors such as obesity, gestational diabetes and hypertension were highly prevalent. The team evaluated two maternal inflammatory biomarkers measured at three timepoints in the pregnancy. Maternal depression and anxiety diagnoses were extracted from Finland's national health registry.

For the children, the research team cast a wider net, using medical records and mothers' reports. Developmental delays were defined based on maternal reports and diagnoses extracted from Finland's national medical registry and included delays in cognitive, motor and social development.

Results revealed that prenatal exposure to at least one of the maternal metabolic conditions or mental health adversities was associated with a two-fold higher risk of childhood neurodevelopmental delays across multiple areas, and was also linked to persistently high levels of antenatal inflammation. Prenatal exposure to higher levels of two maternal inflammatory biomarkers also increased a child's risk of neurodevelopmental delays. The two biomarkers combined predicted childhood neurodevelopmental delay more precisely than either one alone.

"This study highlights that some potentially modifiable prenatal factors may increase the negative impact of adverse environments upon brain and behavior during childhood," said John Krystal, MD, Editor of Biological Psychiatry.

Dr. Girchenko added, "For women who are at risk, we think antenatal intervention may provide targeted prevention, such as dietary supplements associated with reduced inflammation. It's an avenue for future studies to determine the most effective interventions. At this stage, we've opened the door for further discoveries in the field."

Intervention trials are needed to see how women and children respond to different interventions. The study also raises new questions about more specific maternal conditions and various childhood outcomes, Dr. Girchenko concluded. Understanding these risk factors can help researchers devise and evaluate interventions to promote a healthy start to life.

Credit: 
Elsevier

Synthesizing a superatom: Opening doors to their use as substitutes for elemental atoms

image: Based on the dendrimer template method, clusters consisting of 3, 12, 13 and other numbers of atoms have been synthesized. The halogen-like superatomic nature of Ga13 was structurally and electrochemically observed to be completely different from the other clusters. The results for these gallium clusters have provided candidates for superatoms.

Image: 
Tokyo Tech

A superatom is a cluster of atoms that seems to exhibit properties similar to those of an elemental atom. Scientists have shown particular interest in superatomic structures, since they can be linked with atoms to produce molecules and could potentially be used to substitute for certain elements in many applications.

But for superatoms to be effectively utilized, they must be specially tailored to resemble the characteristics of the corresponding elements. This transformation depends on the specific combination of electrons involved. For example, when an aluminum atom with 3 valence electrons (outer-shell electrons that can contribute to the formation of chemical bonds) is added to the superatom aluminum-13, the properties change to those of a superatom of aluminum-14. Because of this modifiability, investigating and understanding superatoms further is important. Previous research, however, has been mainly theoretical and largely focused on single clusters, and it has not been able to synthesize superatomic clusters with sufficient volume and stability for practical application.
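
A simple electron count, based on the standard superatom (jellium) shell model rather than on the paper's own calculations, illustrates why a 13-atom cluster of a group-13 metal such as gallium ends up halogen-like:

```python
# Electron-counting sketch using the standard superatom shell model
# (background illustration, not a calculation from the study). Group-13
# atoms such as Al and Ga contribute 3 valence electrons each, and
# 40 electrons corresponds to a closed superatomic shell.
CLOSED_SHELL = 40
VALENCE_PER_ATOM = 3   # Al, Ga

for n_atoms in (12, 13, 14):
    electrons = n_atoms * VALENCE_PER_ATOM
    print(f"M{n_atoms}: {electrons} valence electrons, "
          f"{electrons - CLOSED_SHELL:+d} relative to the 40-electron shell")

# M13 -> 39 electrons (-1): one short of a closed shell, hence halogen-like,
#         as reported here for Ga13.
# M14 -> 42 electrons (+2): a different superatomic character again.
```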

In a recent study, scientists from Tokyo Tech and ERATO Japan Science and Technology, led by Dr Tetsuya Kambe and Prof Kimihisa Yamamoto, fabricated clusters of the element gallium (Ga) in solution to demonstrate the effects of changing the number of atoms in a cluster on the properties of the cluster. The team synthesized Ga clusters of 3, 12, 13 and other numbers of atoms using a specialized superatom synthesizer. To characterize and analyze the structural differences among the synthesized clusters, transmission electron microscopic images were captured and calculations were performed using computation tools.

The mass spectrometry revealed that the 13- and 3-atom clusters had superatomic periodicity. The 13-atom cluster differed from the other clusters structurally and electrochemically. But the 3-atom cluster with hydrogen (Ga3H2) was reduced to Ga3H2- and was not detected, suggesting a low stability of this cluster when synthesized in the solution medium.

The ability to alter the clusters reinforces the concept that structural change can be induced in superatoms. Describing the implications of their findings, the scientists explain: "These series of results demonstrate that it is possible to change the valence electrons in superatomic clusters in solution by controlling the number of constituent atoms. This in turn enables the designing and preparation of superatoms."

This study paves the way for future research to investigate the use of superatoms as substitutes for elements. As Dr Kambe, Prof Yamamoto and team reiterate, "the superatom reveals an attractive strategy for creating new building blocks through the use of cluster structures."

Credit: 
Tokyo Institute of Technology

By gum! Scientists find new 110-million-year-old treasure

image: Gum within the fossil leaves.

Image: 
University of Portsmouth

A remarkable new treasure has been found by scientists from the University of Portsmouth - the first fossil plant gum on record. The beautiful, amber-like material has been discovered in 110 million year old fossilised leaves.

University of Portsmouth PhD student Emily Roberts made the discovery while examining fossilised leaves of the Welwitschiophyllum plant, found in the Crato Formation in Brazil. Emily noticed thin amber-coloured bands locked inside some of the fossilised leaves she was studying.

What makes this new 'gem' unique is that unlike amber, which is made from fossilised plant resin, this substance is made from fossilised plant gum. Until now, it has been assumed that plant gums cannot survive the fossilisation process: because gums are water soluble, scientists had assumed they would dissolve and could not survive long enough to be preserved in fossil plant remains. As this fossilised gum looks so like amber, it is thought that there may be many other amber-coloured substances in fossil plants that have been wrongly interpreted without chemical confirmation.

Plants produce fluids such as resins and gums, which have different functions within the plant. Resins are a response to wounding and act as a defence against disease and insects. Gums are involved in food storage, structural support and wound sealing. Although gums and resins look similar, they are chemically different and gums are well known to dissolve in water. Previously, only fossilised plant resins (ambers) have been reported.

Emily said: "This new discovery overturns the basic assumption that plant gums cannot be preserved in the fossil record. It has opened our eyes to the fact that other plant chemicals may also be preserved - we can no longer just make assumptions. When we first tested the gum I was astonished that we were confirming something that was thought to be impossible - it just goes to show that fossil plants can surprise us."

This study, published in the journal Scientific Reports, has also revealed another significant finding: the Welwitschiophyllum plant is considered to be related to one of the oldest and most enigmatic plants in existence. Remarkably, a presumed relative of this plant is still growing today. Welwitschia is the sole survivor of this lineage and is now found only in the Namib Desert in Namibia and southern Angola.

Co-author Professor David Martill, of the School of the Environment, Geography and Geosciences at the University of Portsmouth, said: "Emily has not only discovered something ground-breaking about plant gum, but perhaps even more astonishing, her findings confirm that the Welwitschia plant found in Africa today produces a gum similar to a plant growing 110 million years ago in Brazil. Welwitschia is one of life's survivors, thriving in one of the harshest environments on earth for over 120 million years. This discovery is extremely exciting, especially when put into the context of these two continents of Africa and South America, being one during the Cretaceous period."

Researchers suggest there is still much to be learnt and that future work should focus on how this preserved gum has survived 110 million years.

This research was a collaboration between the University of Portsmouth, the University of Vienna (with amber expert Dr Leyla Seyfullah) and the British Library (with FTIR (Fourier Transform Infrared Spectroscopy) specialist Dr Paul Garside).

Credit: 
University of Portsmouth

New sandboxing approach in web browser increases security

image: Hovav Schacham, professor of computer science at the University of Texas at Austin, and his colleagues have developed a new way to secure web browsers that uses an existing tool called WebAssembly to shift some of the browser code into "secure sandboxes" that prevent malicious code from taking over the user's computer.

Image: 
University of Texas at Austin

A powerful new approach to securing web browsers, using a tool called WebAssembly, is getting its first real-world application in the Firefox browser. Developed by a team of researchers from The University of Texas at Austin, the University of California San Diego, Stanford University and Mozilla, the approach shifts some of the browser code into "secure sandboxes" that prevent malicious code from taking over the user's computer.

The new approach is now part of a test release of the Firefox browser for the Linux operating system and could be available on Windows and MacOS platforms within a few months.

Web browsers use libraries of code to do common activities -- such as rendering media files including photos, videos and audio -- but these libraries often have unreported bugs that can be exploited by hackers to take control of a computer.

"Modern browsers are the nightmare scenario for security," said Hovav Shacham, professor of computer science at UT Austin and co-author of a related paper accepted for presentation at a computer security conference to be held this August. "They have every feature imaginable. The more features you have, the more bugs there are. And the more bugs there are, the more chances an attacker has to compromise people's devices. Attackers love attacking browsers, and they really understand how to do it."

To prevent hackers from exploiting these vulnerabilities, the researchers are adapting WebAssembly, a security mechanism originally designed to speed up web applications that run within a browser while keeping those applications within "secure sandboxes" that prevent malicious code from taking over the user's computer. Applications that take advantage of WebAssembly include games and apps that perform music streaming, video editing, encryption and image recognition. In the researchers' new approach, some of the browser's own internal components -- those responsible for the decoding of media files -- would be shifted into WebAssembly sandboxes.

The researchers' approach, called the RLBox framework, is described in a paper ("Retrofitting Fine Grain Isolation in the Firefox Renderer") that will be presented at the USENIX Security Symposium in August. The paper's first author is UC San Diego Computer Science and Engineering Department graduate student Shravan Narayan, and the lead author is UC San Diego assistant professor Deian Stefan.
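
The central discipline RLBox enforces is that data coming out of a sandboxed library is treated as untrusted ("tainted") until it passes an explicit check. The sketch below illustrates that idea in Python for readability; the real RLBox framework is a C++ library, and the names here are illustrative rather than an exact mirror of its API:

```python
# Conceptual illustration (in Python, with illustrative names) of the
# tainted-data discipline that RLBox enforces in C++: values returned by a
# sandboxed library cannot be used until an explicit verifier accepts them.

class Tainted:
    """Wraps a value produced inside the sandbox."""
    def __init__(self, value):
        self._value = value

    def copy_and_verify(self, verifier):
        """Return the wrapped value only if the verifier accepts it."""
        if not verifier(self._value):
            raise ValueError("sandboxed library returned an unexpected value")
        return self._value

def decode_image_in_sandbox(raw_bytes):
    # Stand-in for calling an image-decoding library compiled to WebAssembly.
    width, height = 640, 480          # pretend output of the decoder
    return Tainted((width, height))

# The browser-side code must state what a plausible result looks like before
# it is allowed to touch the decoder's output.
dims = decode_image_in_sandbox(b"...").copy_and_verify(
    lambda wh: 0 < wh[0] <= 10_000 and 0 < wh[1] <= 10_000
)
print(dims)   # (640, 480)
```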

The new approach will initially be applied to a test version of Firefox for the Linux operating system and will secure just one rendering library used for certain fonts. Assuming the initial tests go well, the team expects the approach will be gradually expanded to include stable, full release versions of the browser on all major operating systems. They also anticipate future expansion will include other components involved in rendering media files.

"If the initial tests go well, then Firefox could apply this to all the image, video and audio formats that the browser supports," Shacham said. "The hope is that at some point, bugs in all of those libraries become useless for hacking Firefox. And if that happens, then user security would be greatly improved."

Over time, as more parts of the browser get these improvements and are incorporated into versions on more operating systems, it could improve security for millions of users worldwide. There are roughly 250 million monthly active users of the Firefox browser on desktop computers.

"Defects happen," said Eric Rescorla, Firefox CTO at Mozilla. "To keep our users secure on the internet, we need to ensure that a single programming error cannot easily compromise the browser. To date the industry's approach to this problem has been very coarse-grained, which limits its effectiveness. We're very excited to bring the new level of isolation provided by RLBox to our users."

Credit: 
University of Texas at Austin

Want to catch a photon? Start by silencing the sun

image: Even with a mesh screen covering an object (top), the Stevens quantum 3D imaging technique generates images 40,000x clearer (middle) than current technologies (bottom).

Image: 
Stevens Institute of Technology

Researchers at Stevens Institute of Technology have created a 3D imaging system that uses light's quantum properties to create images 40,000 times crisper than current technologies, paving the way for never-before-seen LIDAR sensing and detection in self-driving cars, satellite mapping systems, deep-space communications and medical imaging of the human retina.

The work, led by Yuping Huang, director of the Center for Quantum Science and Engineering at Stevens, addresses a decades-old problem with LIDAR, which fires lasers at distant targets, then detects the reflected light. While light detectors used in these systems are sensitive enough to create detailed images from just a few photons - minuscule particles of light that can be encoded with information - it's tough to differentiate reflected fragments of laser light from brighter background light such as sunbeams.

"The more sensitive our sensors get, the more sensitive they become to background noise," said Huang, whose work appears in the Feb. 17 advanced online issue of Nature Communications. "That's the problem we're now trying to solve."

The technology is the first real-world demonstration of single-photon noise reduction using a method called Quantum Parametric Mode Sorting, or QPMS, which was first proposed by Huang and his team in a 2017 Nature paper. Unlike most noise-filtering tools, which rely on software-based post-processing to clean up noisy images, QPMS checks light's quantum signatures through exotic nonlinear optics to create an exponentially cleaner image at the level of the sensor itself.

Detecting a specific information-bearing photon amid the roar of background noise is like trying to pluck a single snowflake from a blizzard -- but that's exactly what Huang's team has managed to do. Huang and colleagues describe a method for imprinting specific quantum properties onto an outgoing pulse of laser light, and then filtering incoming light so that only photons with matching quantum properties are registered by the sensor.

The result: an imaging system that is incredibly sensitive to photons returning from its target, but that ignores virtually all unwanted noisy photons. The team's approach yields sharp 3D images even when every signal-carrying photon is drowned out by 34 times as many noisy photons.

"By cleaning up initial photon detection, we're pushing the limits of accurate 3D imaging in a noisy environment," said Patrick Rehain, a Stevens doctoral candidate and the study's lead author. "We've shown that we can reduce the amount of noise about 40,000 times better than the top current imaging technologies."

That hardware-based approach could facilitate the use of LIDAR in noisy settings where computationally intensive post-processing isn't possible. The technology could also be combined with software-based noise reduction to yield even better results. "We aren't trying to compete with computational approaches -- we're giving them new platforms to work in," Rehain said.

In practical terms, QPMS noise reduction could allow LIDAR to be used to generate accurate, detailed 3D images at ranges of up to 30 kilometers. It could also be used for deep-space communication, where the sun's harsh glare would ordinarily drown out distant laser pulses.

Perhaps most excitingly, the technology could also give researchers a closer look at the most sensitive parts of the human body. By enabling virtually noise-free single-photon imaging, the Stevens imaging system will help researchers create crisp, highly detailed images of the human retina using almost invisibly faint laser beams that won't damage the eye's sensitive tissues.

"The single-photon imaging field is booming," said Huang. "But it's been a long time since we've seen such a big step forward in noise reduction, and the benefits it could impart to so many technologies."

Credit: 
Stevens Institute of Technology