Culture

Identification of bovine IVF embryos without chromosome abnormalities by live-cell imaging

video: Live-cell imaging of bovine IVF embryo for 8 days.

Image: 
Tatsuma Yao, Kindai University

In vitro fertilized (IVF) embryo transfer has become an important innovation in agricultural sectors such as cattle production. Approximately half of all bovine embryos produced worldwide are now derived from IVF. However, the pregnancy success rate of IVF embryos transferred into recipients remains low. To improve pregnancy outcomes, key technological issues affecting both the in vitro production of embryos and the assessment of viable embryos must be addressed. The same is true for human assisted reproductive technology (ART). Bovine embryo quality is generally assessed by morphological grading on days 7 to 8 post-insemination, as recommended by the International Embryo Technology Society (IETS), but the pregnancy success rate of embryos judged transferable is only 30% to 50%. A technology for noninvasively and reliably selecting viable IVF embryos has therefore long been sought.

Research groups led by Dr. Satoshi Sugimura (Tokyo University of Agriculture and Technology), Dr. Kazuo Yamagata (Kindai University), Mr. Tatsuma Yao (FUSO Pharmaceutical Industries, Ltd.) and Dr. Satoko Matoba (NARO) now report in the journal Scientific Reports the successful noninvasive selection of bovine IVF embryos without chromosomal abnormalities by long-term "live-cell imaging" with fluorescence confocal microscopy. Chromosomal abnormalities were detected by injecting mRNA encoding histone H2B-mCherry and EGFP-α-tubulin. Live-cell imaging revealed that about half of the embryos judged morphologically transferable by IETS criteria had nuclear or chromosomal abnormalities, such as an abnormal number of pronuclei or abnormal chromosome segregation, which may lead to abortion. Both recipients that received embryos free of nuclear and chromosomal abnormalities became pregnant. Selecting bovine IVF embryos by live-cell imaging is therefore expected to improve pregnancy success.

Credit: 
Tokyo University of Agriculture and Technology

Study reveals that many oncologists recommend medical marijuana clinically despite not feeling sufficiently knowledgeable to do so

While a wide majority of oncologists do not feel informed enough about medical marijuana's utility to make clinical recommendations, most do in fact conduct discussions on medical marijuana in the clinic and nearly half recommend it to their patients, say researchers who surveyed a population-based sample of medical oncologists.

The study, published today in the Journal of Clinical Oncology, is the first nationally representative survey of medical oncologists to examine attitudes, knowledge and practices regarding the agent since medical marijuana became legal at the state level in the U.S. Medical marijuana refers to non-pharmaceutical cannabis products that healthcare providers recommend for therapeutic purposes. A significant proportion of medical marijuana products are whole-plant marijuana, which contains hundreds of active ingredients with complicated synergistic and inhibitory interactions. By contrast, cannabinoid pharmaceuticals, which are available by prescription through a pharmacy, contain no more than a couple of active ingredients. While considerable research has gone into the development of cannabinoid pharmaceuticals, much less has been done on medical marijuana's utility in cancer and other diseases. The researchers speculate that this immature scientific evidence base poses challenges for oncologists.

"In this study, we identified a concerning discrepancy: although 80% of the oncologists we surveyed discussed medical marijuana with patients and nearly half recommended use of the agent clinically, less than 30% of the total sample actually consider themselves knowledgeable enough to make such recommendations," said Ilana Braun, MD, chief of Dana-Farber Cancer Institute's Division of Adult Psychosocial Oncology. "We can think of few other instances in which physicians would offer clinical advice about a topic on which they do not feel knowledgeable. We suspect that this is at least partly due to the uncomfortable spot in which oncologists find themselves. Medical marijuana is legal in over half the states, with cancer as a qualifying condition in the vast majority of laws, yet the scientific evidence base supporting use of medical marijuana in oncology remains thin."

The mailed survey queried medical oncologists' attitudes toward medical marijuana's efficacy and safety in comparison with standard treatments; their practices regarding medical marijuana, including holding discussions with patients and recommending medical marijuana clinically; and whether they considered themselves sufficiently informed regarding medical marijuana's utility in oncology. Responses indicated significant differences in attitudes and practices based on non-clinical factors, for instance regional location in the U.S.

"Ensuring that physicians have sufficient knowledge on which to base their medical recommendations is essential to providing high-quality care," according to Eric G. Campbell, PhD, formerly a professor of medicine at the Massachusetts General Hospital, now a professor at the University of Colorado School of Medicine. "Our study suggests that there is clearly room for improvement when it comes to medical marijuana."

To date, no randomized clinical trials have examined whole-plant medical marijuana's effects in cancer patients, so oncologists are limited to relying on lower quality evidence, research on pharmaceutical cannabinoids or research on medical marijuana's use in treating diseases other than cancer.

Of note, additional findings of the current study suggest that nearly two-thirds of oncologists believe medical marijuana to be an effective adjunct to standard pain treatment, and as effective as or more effective than standard therapies for symptoms such as nausea and lack of appetite, common side effects of cancer treatments such as chemotherapy.

Credit: 
Dana-Farber Cancer Institute

Angry birds: Size of jackdaw mobs depends on who calls warning

image: Jackdaws.

Image: 
Victoria Lee

Jackdaws recognise each other's voices and respond in greater numbers to warnings from familiar birds than strangers, new research shows.

The birds produce a harsh "scolding call" when they spot a predator, calling fellow jackdaws to mob the intruder and drive it away.

University of Exeter researchers have discovered that each bird has a unique call, and the size of the mob depends on which bird calls the warning.

The scientists played recordings of individual calls and found that the largest mobs assembled when birds heard the cry of a member of their own colony.

"Joining a mobbing event can be dangerous, as it involves approaching a predator, so it makes sense for individuals to be selective in whom they join. Our results show that jackdaws use the ability to discriminate between each other's voices when deciding whether to join in potentially risky collective activities," said Dr Alex Thornton, of the Centre for Ecology and Conservation on the University of Exeter's Penryn Campus in Cornwall.

"We also found a positive feedback loop - if birds joining a mob made alarm calls of their own, this in turn caused more birds to join in, magnifying the size of the mob."

The researchers studied wild jackdaws, a highly social member of the crow family, as part of the Cornish Jackdaw Project, a long-term study of jackdaw behaviour and cognition in sites across Cornwall.

In playbacks at nest-box colonies during the breeding season, they broadcast the warning calls of a resident from each nest-box, another member of the same colony, a member of a different colony, and a rook (a species that often associates with jackdaws).

Jackdaws were most likely to respond to a warning call from the resident nest-box owner, followed in turn by other colony members, non-colony members and rooks.

Responses were also influenced by caller sex, with jackdaws less likely to echo a warning if the caller was a female stranger from a different colony.

The paper, published in the journal Scientific Reports, is entitled: "Caller characteristics influence recruitment to collective anti-predator events in jackdaws."

Credit: 
University of Exeter

Researchers hide information in plain text

image: Someone using FontCode would supply a secret message and a carrier text document. FontCode converts the secret message to a bit string (ASCII or Unicode) and then into a sequence of integers. Each integer is assigned to a five-letter block in the regular text where the numbered locations of each letter sum to the integer.

Image: 
Changxi Zheng/Columbia Engineering

New York, NY--May 10, 2018--Computer scientists at Columbia Engineering have invented FontCode, a new way to embed hidden information in ordinary text by imperceptibly changing, or perturbing, the shapes of fonts in text. FontCode creates font perturbations, using them to encode a message that can later be decoded to recover the message. The method works with most fonts and, unlike other text and document methods that hide embedded information, works with most document types, even maintaining the hidden information when the document is printed on paper or converted to another file type. The paper will be presented at SIGGRAPH in Vancouver, British Columbia, August 12-16.

"While there are obvious applications for espionage, we think FontCode has even more practical uses for companies wanting to prevent document tampering or protect copyrights, and for retailers and artists wanting to embed QR codes and other metadata without altering the look or layout of a document," says Changxi Zheng, associate professor of computer science and the paper's senior author.

Zheng created FontCode with his students Chang Xiao (PhD student) and Cheng Zhang MS'17 (now a PhD student at UC Irvine) as a text steganographic method that can embed text, metadata, a URL, or a digital signature into a text document or image, whether it's digitally stored or printed on paper. It works with common font families, such as Times Roman, Helvetica, and Calibri, and is compatible with most word processing programs, including Word and FrameMaker, as well as image-editing and drawing programs, such as Photoshop and Illustrator. Since each letter can be perturbed, the amount of information conveyed secretly is limited only by the length of the regular text. Information is encoded using minute font perturbations--changing the stroke width, adjusting the height of ascenders and descenders, or tightening or loosening the curves in serifs and the bowls of letters like o, p, and b.

"Changing any letter, punctuation mark, or symbol into a slightly different form allows you to change the meaning of the document," says Xiao, the paper's lead author. "This hidden information, though not visible to humans, is machine-readable just as barcodes and QR codes are instantly readable by computers. However, unlike barcodes and QR codes, FontCode doesn't mar the visual aesthetics of the printed material, and its presence can remain secret."

Data hidden using FontCode can be extremely difficult to detect. Even if an attacker detects font changes between two texts--highly unlikely given the subtlety of the perturbations--it simply isn't practical to scan every file coming and going within a company.

Furthermore, FontCode not only embeds but can also encrypt messages. While the perturbations are stored in a numbered location in a codebook, their locations are not fixed. People wanting to communicate through encrypted documents would agree on a private key that specifies the particular locations, or order, of perturbations in the codebook.

"Encryption is just a backup level of protection in case an attacker can detect the use of font changes to convey secret information," says Zheng. "It's very difficult to see the changes, so they are really hard to detect--this makes FontCode a very powerful technique to get data past existing defenses."

FontCode is not the first technology to hide a message in text--programs exist to hide messages in PDF and Word files or to resize whitespace to denote a 0 or 1--but, the researchers say, it is the first to be document-independent and to retain the secret information even when a document or an image with text (PNG, JPG) is printed or converted to another file type. This means a FrameMaker or Word file can be converted to PDF, or a JPEG can be converted to PNG, all without losing the secret information.

To use FontCode, you would supply a secret message and a carrier text document. FontCode converts the secret message to a bit string (ASCII or Unicode) and then into a sequence of integers. Each integer is assigned to a five-letter block in the regular text where the numbered codebook locations of each letter sum to the integer.
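The block-encoding idea described above can be sketched in a few lines. This is an illustrative toy, not the authors' implementation: the function names and the assumption that every letter offers a fixed number of numbered perturbation variants are ours.

```python
# Illustrative sketch (not the authors' implementation): encode one integer
# into a five-letter block by picking a perturbation-codebook index for each
# letter so that the indices sum to the integer. We assume, hypothetically,
# that every letter offers `variants` perturbed glyphs numbered 0..variants-1.

def encode_block(value, variants=5, block_len=5):
    """Choose per-letter perturbation indices summing to `value`."""
    if not 0 <= value <= (variants - 1) * block_len:
        raise ValueError("integer out of range for this block size")
    indices, remaining = [], value
    for pos in range(block_len):
        letters_left = block_len - pos - 1
        # take as much as possible here, but leave a reachable remainder
        idx = min(variants - 1, remaining)
        idx = max(idx, remaining - letters_left * (variants - 1))
        indices.append(idx)
        remaining -= idx
    return indices

def decode_block(indices):
    """The hidden integer is just the sum of the recognized indices."""
    return sum(indices)

assert decode_block(encode_block(12)) == 12
```

Because only the sum matters, many index combinations encode the same integer, which gives the real system freedom in choosing which glyphs to perturb.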

Recovering hidden messages is the reverse process. From a digital file or from a photograph taken with a smartphone, FontCode matches each perturbed letter to the original perturbation in the codebook to reconstruct the original message.

Matching is done using convolutional neural networks (CNNs). Recognizing vector-drawn fonts (such as those stored as PDFs or created with programs like Illustrator) is straightforward since shape and path definitions are computer-readable. However, it's a different story for PNG, IMG, and other rasterized (or pixel) fonts, where lighting changes, differing camera perspectives, or noise or blurriness may mask a part of the letter and prevent an easy recognition.

While CNNs are trained to take into account such distortions, recognition errors will still occur, and a key challenge for the researchers was ensuring a message could always be recovered in the face of such errors. Redundancy is one obvious way to recover lost information, but it doesn't work well with text since redundant letters and symbols are easy to spot.

Instead, the researchers turned to the 1700-year-old Chinese Remainder Theorem, which identifies an unknown number from its remainder after it has been divided by several different divisors. The theorem has been used to reconstruct missing information in other domains; in FontCode, researchers use it to recover the original message even when not all letters are correctly recognized.
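As an illustration of how the theorem enables such recovery, the sketch below (ours, not the paper's actual coding scheme) hides a small secret as its remainders modulo several coprime numbers; because the moduli are redundant, the secret survives even when some remainders are lost.

```python
# Illustrative sketch (not the paper's actual coding scheme): the Chinese
# Remainder Theorem recovers a number from its remainders modulo pairwise
# coprime divisors. Storing remainders for extra moduli adds redundancy,
# so the number survives even if some remainders are lost.
from functools import reduce

def crt(remainders, moduli):
    """Find the unique x (mod product of moduli) with x % m == r."""
    M = reduce(lambda a, b: a * b, moduli)
    x = 0
    for r, m in zip(remainders, moduli):
        Mi = M // m
        # pow(Mi, -1, m) is the modular inverse of Mi mod m (Python 3.8+)
        x += r * Mi * pow(Mi, -1, m)
    return x % M

secret = 42  # any value below 7 * 11 = 77 is pinned down by two remainders
moduli = [7, 11, 13, 17]
remainders = [secret % m for m in moduli]

# Suppose recognition fails for the first two remainders: the surviving
# pair (mod 13, mod 17) still determines the secret, since 13 * 17 > 77.
assert crt(remainders[2:], moduli[2:]) == secret
```

This mirrors Zheng's equation analogy: any sufficiently large subset of the stored remainders pins down the hidden value, so some recognition errors can be tolerated.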

"Imagine having three unknown variables," says Zheng. "With three linear equations, you should be able to solve for all three. If you increase the number of equations from three to five, you can solve the three unknowns as long as you know any three out of the five equations."

Using the Chinese Remainder Theorem, the researchers demonstrated they could recover messages even when 25% of the letter perturbations were not recognized. Theoretically, the tolerable error rate could exceed 25%.

The authors, who have filed a patent with Columbia Technology Ventures, plan to extend FontCode to other languages and character sets, including Chinese.

"We are excited about the broad array of applications for FontCode," says Zheng, "from document management software, to invisible QR codes, to protection of legal documents. FontCode could be a game changer."

Credit: 
Columbia University School of Engineering and Applied Science

Using proteomics to understand pathogens

image: A recent study in the journal Molecular & Cellular Proteomics investigated how virulent strains of Aspergillus fumigatus, an opportunistic pathogen, disrupt innate immune signaling in macrophages.

Image: 
CDC/Dr. Libero Ajello, via CDC Public Health Image Library

Recent studies in the journal Molecular & Cellular Proteomics have shed light on pathogenic mechanisms of the sexually transmitted parasite Trichomonas vaginalis and the HIV-associated opportunistic lung fungus Aspergillus.

Fatty acid addition lets a parasite stick

Trichomonas vaginalis, commonly known as trich, is the most common curable sexually transmitted disease, and the vast majority of people with the disease - upwards of 70 percent - do not experience symptoms, according to the Centers for Disease Control and Prevention. However, the protozoan parasite can increase an infected person's risk of contracting HIV or developing cancer, and can cause preterm labor in pregnant women. In other parasites, a protein modification called palmitoylation, the addition of a 16-carbon saturated fatty acid to cysteine residues of a protein, regulates infectivity. In a new paper in the journal Molecular & Cellular Proteomics, researchers at the Instituto Tecnologico de Chascomus in Buenos Aires, and the University of California, Los Angeles, enriched palmitoylated proteins from T. vaginalis and found numerous palmitoylation sites in pathogenesis-related proteins. Yesica Nievas and colleagues report that disrupting palmitoylation reduced the protist's self-aggregation and adhesion to host cells. This work establishes the importance of palmitoylation in T. vaginalis proteins for infection and suggests that palmitoylation enzyme inhibitors may help treat the infection. DOI: 10.1074/mcp.RA117.000018

How a fungus outfoxes the macrophage

Aspergillus fumigatus is an opportunistic pathogen in the lung. People with compromised immune systems, either from disease or immune-suppression therapy, are especially vulnerable to the airborne mold spores, called conidia. In a healthy person, macrophages in the alveoli take up the fungus and normally an acidic organelle called the phagolysosome destroys it. Conidia from A. fumigatus, however, disrupt acidification of the phagolysosome and prevent the infected macrophage from committing suicide through apoptosis. Researchers at the Leibniz Institute for Natural Product Research and Infection Biology in Germany investigated the immune evasion strategies the pathogen uses by infecting cultured macrophages with magnetically tagged Aspergillus conidia from a virulent strain or a less infectious mutant strain. Hella Schmidt and colleagues then extracted phagolysosomes from macrophages infected with each strain and compared their proteomes. In a paper in the journal Molecular & Cellular Proteomics, the team reported that the more virulent strain reduces maturation of the phagolysosome and proinflammatory immune signaling. These disruptions ensured that the more virulent strain had a comfortable place to survive inside the host. DOI: 10.1074/mcp.RA117.000069

Credit: 
American Society for Biochemistry and Molecular Biology

Women seeking crowdfunding financing for start-ups are perceived as more trustworthy

BLOOMINGTON, Ind. -- While men have benefited from a gender bias against women when seeking financing for business start-ups, the opposite may be true for female entrepreneurs seeking initial investment through crowdfunding efforts, according to research from Indiana University's Kelley School of Business.

Historically, female-led ventures have found it difficult to procure private equity, bank financing and venture capital, often because of psychological stereotypes inferring that business leaders should be masculine. But an article in print at the Journal of Business Venturing reports that gender bias may make crowdfunding investors more likely to invest in new ventures led by women.

"Our results show that on average, crowdfunders think female entrepreneurs are more trustworthy than male entrepreneurs," said Regan Stevenson, assistant professor of management and entrepreneurship at Kelley. "These judgments increase overall investment in female-led ventures over male-led ventures.

"When a crowdfunder holds high levels of implicit gender bias, the funder is actually more likely to invest in a woman because they perceive the woman as trustworthy," Stevenson said. "This is the opposite effect of what has been demonstrated in prior research in the venture capital setting."

Using three years of data from the crowdfunding platform Kickstarter, Stevenson and his colleagues examined investor stereotypes and implicit bias in crowdfunding decisions. Through a study sample of 416 projects, they examined the entrepreneur's gender, the financial backing received and funding success.

They found that women were more likely than men to have their Kickstarter projects funded. To discern why this was the case, the researchers conducted an experiment with 73 amateur investors based in the eastern U.S.

The results from the second half of the study supported the researchers' view that perceptions about trustworthiness facilitate financial backing and that implicit gender bias influenced funders' willingness to fund female entrepreneurs.

This is significant because earlier research has found that female-led firms receive only 1.3 percent of venture capital financing. They are often asked to surrender a greater proportion of ownership when receiving private funding. But in crowdfunding -- where a "crowd" of amateur investors make small investments in new companies -- empirical observations revealed a funding advantage for women.

"Previous research in the venture capital setting has advocated that women should downplay feminine characteristics to increase their chances of obtaining funding," Stevenson said. "This advice not only appears to be dogmatic, but our data shows it is simply bad advice for female entrepreneurs when they are obtaining funding through crowdfunding platforms."

The paper also shows that crowdfunding provides a means for building trust with funders, which later may improve women's broader entrepreneurial prospects.

"These findings demonstrate an alternative pathway for women entrepreneurs that may allow them to overcome the negative aspects of gender bias and

Credit: 
Indiana University

More than one day of early-pregnancy bleeding linked to lower birthweight

Women who experience vaginal bleeding for more than one day during the first trimester of pregnancy may be more likely to have a smaller baby, compared to women who do not experience bleeding in the first trimester, suggest researchers at the National Institutes of Health. On average, full-term babies born to women with more than one day of bleeding in the first trimester were about 3 ounces lighter than those born to women with no bleeding during this time. Additionally, infants born to women with more than a day of first trimester bleeding were roughly twice as likely to be small for gestational age, a category that includes infants who are healthy but small, as well as those whose growth has been restricted because of insufficient nutrition or oxygen or other causes.

Credit: 
NIH/Eunice Kennedy Shriver National Institute of Child Health and Human Development

The joy of neurons: A simplified 'cookbook' for engineering brain cells to study disease

image: Human induced neurons generated from a novel transcription factor pair Neurog3/Pit1. Colors represent a nuclear stain (DAPI) in blue, synaptic marker expression (Synapsin1) in green, and neuron-specific beta-III tubulin expression (Tuj1) in red.

Image: 
Baldwin / The Scripps Research Institute

LA JOLLA, CA - May 9, 2018 - Scientists at The Scripps Research Institute have devised what they call a "neuronal cookbook" for turning skin cells into different types of neurons. As reported today in the journal Nature, the research opens the door to studying common brain conditions such as autism, schizophrenia, addiction and Alzheimer's disease under reproducible conditions in a dish.

"The brain is incredibly complex, with thousands of different types of cells that are each involved in different diseases," says Kristin Baldwin, PhD, professor at Scripps Research and senior author of the study. "The problem with understanding and treating the many disorders of the brain is that we cannot reproducibly produce the right types of brain cells. Now we have found more than 75 new ways to rapidly and reproducibly turn skin cells into neurons that we think will be much better representatives of different neurologic diseases than were previously available.

"Having a personalized and nearly unlimited supply of different types of neuronal cells in a dish lets you uncover what's going wrong in a disease. At the same time, the study supplies a new toolkit to test thousands of drugs on the affected cells to try to reverse the problems, rather than having to test them in mice or other animals, with results that are often difficult to interpret for human conditions," Baldwin adds.

The Baldwin lab wondered if it would be possible to both simplify and expand the coding toolbox for making neuronal cells directly from skin cells. First author Rachel Tsunemoto, PhD, had hints from a previous study that it might be possible to generate very specific types of neurons using only two transcription factors at a time. So, she and other lab members engineered and tested a large set of two factor codes to see if they could convert skin cells into cells with the essential core traits of neurons, such as their shape and electrical excitability.

While they had expected to find a handful of new factors, or possibly none at all, the results of their large screen were quite surprising. Of the nearly 600 transcription factor pairs tested, more than 12 percent produced neurons--yielding more than 70 new recipes, or codes, for neuronal production.

Next came an even bigger surprise. The "synthetic neurons," as Baldwin calls them, started to grow synapses and try to communicate with each other. "This came just two to three weeks after they were fibroblast cells that would normally never communicate," says Baldwin.

"This was really great to see," says Tsunemoto, co-first author of the study and a researcher with Scripps Research and the University of California, San Diego at the time of the study. "But obviously, there was a lot more follow-up work to be done to understand this exciting neuronal complexity in a dish in more detail."

For years, the challenge has been to see past the traits that neurons share--like the ability to communicate using connections called synapses--and figure out why certain neurons have special properties, like the ability to produce dopamine or respond to neuroactive drugs such as nicotine, that also correlate with their involvement in different diseases.

Together with Sohyon Lee, a co-first author of the study and a recent PhD graduate at Scripps Research, Tsunemoto began to sort through the "outputs" of the different codes using traditional electrical recording methods and new sensitive sequencing methods to see whether the codes would produce neurons with different features.

What they found was exciting. Each code produced a set of neurons with different properties, some of which seemed immediately useful for understanding how differences in our genetic makeups can predispose us to neurologic diversity in disorders such as autism, nicotine addiction or neurodegeneration.

Baldwin notes that this work builds on years of research from her lab and around the world. Earlier studies led by Nobel laureate Shinya Yamanaka and by Marius Wernig's group at Stanford University showed that sets of three to four factors could convert skin cells into pluripotent stem cells or directly into neurons. Previous work from Baldwin's lab showed that sets of just two factors could selectively generate specific kinds of neurons that respond to pain and itch.

The new study is a big step forward in cellular reprogramming. "The transcription factors to make neurons are like codes," Baldwin says, and now researchers can input these codes to get the precise neuronal types they need for research, over and over again, eliminating the need for very precious human samples obtained from brain surgeries (which can be studied for only a few hours).

"Now we can be better genome detectives," says Baldwin. "Building up a database of these codes and the types of neurons they produce can help us directly link genomic studies of human brain disease to a molecular understanding of what goes wrong with neurons, which is the key to finding and targeting treatments.

"We're already working with collaborators and writing grants to apply this platform to autism and Alzheimer's disease research, as well as to some rare diseases, such as Friedreich's ataxia," adds Baldwin. "We also think the transcription factor pairs we studied here are only the tip of the iceberg and there are likely many more codes for neuronal types that can be found."

Researchers are encouraging other scientists to use their results through a freely available platform called BioGPS, run by Scripps Research scientists. "There's so much to explore," says Tsunemoto.

Credit: 
Scripps Research Institute

Atmospheric seasons could signal alien life

image: Satellites monitor how 'greenness' changes with Earth's seasons. UCR scientists are studying the accompanying changes in atmospheric composition as a marker for life on distant planets.

Image: 
NASA

RIVERSIDE, Calif. (http://www.ucr.edu) --Dozens of potentially habitable planets have been discovered outside our solar system, and many more are awaiting detection.

Is anybody -- or anything -- there?

The hunt for life in these places, which are impossible to visit in person, will begin with a search for biological products in their atmospheres. These atmospheric fingerprints of life, called biosignatures, will be detected using next-generation telescopes that measure the composition of gases surrounding planets that are light years away.

It's a tricky business, since biosignatures based on single measurements of atmospheric gases could be misleading. To complement these markers, and thanks to funding from the NASA Astrobiology Institute, scientists at the University of California, Riverside's Alternative Earths Astrobiology Center are developing the first quantitative framework for dynamic biosignatures based on seasonal changes in the Earth's atmosphere.

Titled "Atmospheric Seasonality As An Exoplanet Biosignature," a paper describing the research was published today in The Astrophysical Journal Letters. The lead author is Stephanie Olson, a graduate student in UCR's Department of Earth Sciences.

As Earth orbits the sun, its tilted axis means different regions receive more rays at different times of the year. The most visible signs of this phenomenon are changes in the weather and length of the days, but atmospheric composition is also impacted. For example, in the Northern Hemisphere, which contains most of the world's vegetation, plant growth in summer results in noticeably lower levels of carbon dioxide in the atmosphere. The reverse is true for oxygen.

"Atmospheric seasonality is a promising biosignature because it is biologically modulated on Earth and is likely to occur on other inhabited worlds," Olson said. "Inferring life based on seasonality wouldn't require a detailed understanding of alien biochemistry because it arises as a biological response to seasonal changes in the environment, rather than as a consequence of a specific biological activity that might be unique to the Earth." Further, extremely elliptical orbits rather than axis tilt could yield seasonality on extrasolar planets, or exoplanets, expanding the range of possible targets.

In the paper, the researchers identify the opportunities and pitfalls associated with characterizing the seasonal formation and destruction of oxygen, carbon dioxide, and methane, and with detecting these changes using spectroscopy. They also modeled fluctuations of atmospheric oxygen on a life-bearing planet with low oxygen content, like that of Earth billions of years ago. They found that ozone (O3), which is produced in the atmosphere through reactions involving oxygen gas (O2) produced by life, would be a more easily measurable marker for the seasonal variability in oxygen than O2 itself on weakly oxygenated planets.

"It's really important that we accurately model these kinds of scenarios now, so the space and ground-based telescopes of the future can be designed to identify the most promising biosignatures," said Edward Schwieterman, a NASA Postdoctoral Program fellow at UCR. "In the case of ozone, we would need telescopes to include ultraviolet capabilities to easily detect it."

Schwieterman said the challenge in searching for life is the ambiguity of data collected from so far away. False positives -- nonbiological processes that masquerade as life -- and false negatives -- life on a planet that produces few or no biosignatures -- are both major concerns.

"Both oxygen and methane are promising biosignatures, but there are ways they can be produced without life," Schwieterman said.

Olson said observing seasonal changes in oxygen or methane would be more informative.

"A potentially powerful way to assess exoplanets for inhabitation would be to observe their atmospheres throughout their orbits to see if we can detect changes in these biosignature gases over the course of a year," she said. "In some circumstances, such changes would be difficult to explain without life and may even allow us to make progress towards characterizing, rather than simply recognizing, life on an exoplanet."
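Olson's point about watching an atmosphere over a full orbit can be made concrete with a toy calculation. The sketch below is an illustration, not the authors' model: the 2% seasonal amplitude and the noise level are assumed values. It simulates one year of noisy abundance measurements and recovers the seasonal amplitude by projecting the series onto the known orbital period.

```python
import math
import random

# Illustrative sketch (not the authors' model): recovering the amplitude of a
# seasonal cycle in a biosignature gas from a noisy year-long abundance series.
random.seed(0)
n = 365
t = [i / n for i in range(n)]                        # time in years (one orbit)
true_amplitude = 0.02                                # assumed 2% seasonal swing
observed = [1.0 + true_amplitude * math.sin(2 * math.pi * ti)
            + random.gauss(0.0, 0.005) for ti in t]  # add measurement noise

# Project onto sine and cosine at the (known) orbital period to estimate the
# seasonal amplitude -- a simple discrete Fourier estimate:
a = 2.0 / n * sum(o * math.sin(2 * math.pi * ti) for o, ti in zip(observed, t))
b = 2.0 / n * sum(o * math.cos(2 * math.pi * ti) for o, ti in zip(observed, t))
recovered = math.hypot(a, b)
print(f"recovered seasonal amplitude: {recovered:.3f}")
```

Averaging over many measurements in this way is what lets a small periodic signal emerge from noise, which is the same reason repeated observations across an orbit are more informative than a single snapshot.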

Timothy Lyons, a professor of biogeochemistry in UCR's Department of Earth Sciences and director of the Alternative Earths Astrobiology Center, said this work advances the fundamental approach to searching for life on very distant planets.

"We are particularly excited about the prospect of characterizing oxygen fluctuations at the low levels we would expect to find on an early version of Earth," Lyons said. "Seasonal variations as revealed by ozone would be most readily detectable on a planet like the Earth of billions of years ago, when most life was still microscopic and ocean dwelling."

Credit: 
University of California - Riverside

Genetic counseling and testing proposed for patients with the brain tumor medulloblastoma

image: Researchers reporting in The Lancet Oncology identified six genes that predispose carriers to develop the brain tumor medulloblastoma.

Image: 
St. Jude Children's Research Hospital

Researchers have identified six genes that predispose carriers to develop the brain tumor medulloblastoma and have used the discovery to craft genetic counseling and screening guidelines. The study appears today in the journal The Lancet Oncology.

St. Jude Children's Research Hospital, Hopp Children's Cancer Center at the NCT Heidelberg (KiTZ), Germany, and The Hospital for Sick Children, Toronto, led the research.

Medulloblastoma is the most common malignant childhood brain tumor and one of the leading causes of non-accidental death in U.S. children and adolescents. The tumor includes four main molecular subgroups with different clinical and biological characteristics as well as treatment outcomes. Except in rare cases associated with genetic disorders like Li-Fraumeni syndrome or Gorlin syndrome, medulloblastoma was thought to occur sporadically by chance, usually in infants and children less than 16 years old.

But researchers have now completed what appears to be the largest analysis yet of genetic predisposition in a pediatric brain tumor and found that germline variations in six genes often play a role. The genes include APC, BRCA2 and TP53, which are also associated with an elevated risk for breast, colon, ovarian and other cancers. The findings led the researchers to develop screening and counseling recommendations for patients based on the medulloblastoma molecular subgroups--WNT, sonic hedgehog and group 3 and group 4.

The newly identified predisposition genes account for about 20 percent of the sonic hedgehog subgroup and about 5 percent of cases overall. Germline variations are usually inherited and carried in cells throughout the body.

"One in five patients with sonic hedgehog medulloblastoma had clear germline predispositions that put them and possibly their siblings at risk for developing medulloblastoma and other cancers later in life," said Paul Northcott, Ph.D., an assistant member of the St. Jude Department of Developmental Neurobiology. He and Sebastian Waszak, Ph.D., of the European Molecular Biology Laboratory, Heidelberg, are co-first authors.

The contribution of germline variations in other medulloblastoma subgroups ranged from less than 5 percent in group 3 or group 4 medulloblastoma patients to about 10 percent of patients with WNT medulloblastoma. Overall, long-term survival is about 70 percent for patients with medulloblastoma, but ranges widely from 95 percent for WNT medulloblastoma to 50 percent for patients with group 3.

"Overall, half the patients with damaging germline variations were not identified based on their family cancer histories, which clinicians have depended on," Northcott said. "That highlights the urgent need to make genetic counseling and testing the standard of care for some medulloblastoma patients, particularly those in the sonic hedgehog and WNT subgroups."

Co-author Giles Robinson, M.D., an assistant member of the St. Jude Department of Oncology, said: "The screenings can help patients and families understand and manage their lifetime cancer risk."

Identifying predisposition genes

Along with APC, BRCA2 and TP53, the other predisposition genes identified in this study were PALB2, PTCH1 and SUFU. The gene variations are predicted to change the encoded protein and disrupt the genes' normal function.

To find the high-risk genes, researchers compared the prevalence of rare variations in 110 known cancer predisposition genes in medulloblastoma patients from three continents to individuals without cancer. The analysis included whole genome and whole exome sequencing data of 1,022 patients with medulloblastoma. The exome is the portion of the genome that encodes instructions for protein assembly.

The samples included tumor and normal tissue from 800 of the 1,022 patients. Investigators reported no significant difference in patients screened prospectively or retrospectively.

The comparison group included exome sequencing data from more than 58,000 individuals with no cancer diagnoses.

Screening and testing guidelines

The findings led to the following subgroup-based counseling and screening recommendations.

WNT - Genetic counseling about possible high-risk APC germline variations should be offered to certain patients with WNT medulloblastoma. Those are patients without tumor cell (somatic) mutations in the gene CTNNB1, which codes for the protein β-catenin.

Sonic hedgehog - Genetic counseling and testing should be offered to all patients in this subgroup for some or all of the following genes: SUFU, PTCH1, TP53, PALB2 and BRCA2. The analysis also revealed that high-risk germline TP53 variations were a risk factor for medulloblastoma treatment failure.

Group 3 and Group 4 - These subgroups account for 65 to 70 percent of all medulloblastoma, but less than 5 percent of cases were associated with the cancer predisposition genes. Researchers recommend counseling and testing for PALB2 and BRCA2 in patients with family histories of breast, ovarian or other cancers associated with mutations in BRCA genes.

Credit: 
St. Jude Children's Research Hospital

The Big Bell Test: Participatory science puts quantum physics to the test

image: The Big Bell Quest video game.

Image: 
ICFO

Quantum randomness is intrinsically different from classical randomness. That is what violations of Bell inequalities, a crucial step in understanding quantum mechanics, demonstrate. One drawback remained, though: until now, testing these inequalities relied on experimental configurations whose parameters were set from data generated by quantum systems. In effect, quantum physics was being tested using quantum physics. To overcome this problem, an international collaboration created by The Institute of Photonic Sciences in Barcelona, spanning twelve laboratories on five continents and including the Institut de Physique de Nice (CNRS/Université Nice Sophia Antipolis), conducted a unique participatory science experiment. By gathering about 100,000 people worldwide through a video game, the researchers circumvented the data generation problem and rigorously validated their experimental observations on the violation of Bell inequalities. The results were published in Nature on 10 May 2018.

In physics, the principle of local realism states that two distant objects can only have limited correlations: events that one of the two objects undergoes cannot be correlated with events at the other beyond a certain degree. During the 20th century, John Stewart Bell formulated this limit between physical objects as mathematical inequalities. However, quantum objects do not follow this rule: measurements on entangled quantum particles are correlated, wherever the particles are in the universe. This observation violates Bell inequalities and therefore the principle of local realism. To explain this phenomenon, conservative physicists earlier in the 20th century -- including Einstein -- had hypothesized that unknown physical parameters ("hidden variables") existed, such that the constraint imposed by the inequalities would hold all the same.

Until now, researchers had only managed to demonstrate the violation of Bell inequalities using data generated by quantum systems to set the parameters of their experiments. To test the correlation between entangled particles, each of them must be measured at random, without the measurements on the two particles being related in any way. To achieve this, quantum random bit generators gave instructions to the measurement devices. This meant testing quantum physics using a quantum system. To get out of this paradox, 100,000 people contributed bits generated at random by a non-quantum mechanism: human choices. The resulting data produced a code that arbitrarily configured the measurement instruments for entangled particles in 13 experiments spread among 12 laboratories in as many countries.

To generate these bits, participants were invited to play a video game: The Big Bell Quest. To progress through the scenario, players pressed the keyboard's 1 and 0 keys at random. These bits were sent directly to the laboratories. On November 30, 2016, the players generated more than 97 million bits that continuously fed the experiments for 12 consecutive hours. As the game progressed, the players could explore science outreach content, such as explanations of Bell inequalities and the way they are tested, accompanied by videos made in the laboratories that were receiving the data.
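For scale, the figures quoted above imply a substantial sustained input rate. This is just a back-of-envelope check using only the numbers given in the text:

```python
# Quick arithmetic from the figures above: the average rate at which players
# supplied random bits over the 12-hour run on November 30, 2016.
bits = 97_000_000
seconds = 12 * 3600
rate = bits / seconds
print(f"average input: about {rate:,.0f} bits per second")
```

That works out to a little over two thousand human-generated random bits per second, sustained for half a day.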

The results of the experiments confirm the violation of Bell inequalities with a more consistent and rigorous methodology than before. They also open the path to deeper applications of quantum physics: the fundamental principles of entanglement play an essential role in the development of quantum cryptography and quantum computing. As for the methodology, it proves that participatory science can play a useful role in fundamental physics.

Credit: 
CNRS

Should the number of GPs' patient consultations be capped?

The British Medical Association recently proposed guidance to cap the number of patients a GP sees each day to prevent unsafe working levels, but should this be recommended? Experts debate the issue in The BMJ today.

Limits to workload could protect GPs and patients in a system that has become dangerous, says Laurence Buckman, a GP partner in London.

Ten-minute consultations are too short for the amount of work that patients' needs require, and the pressure to perform better and for longer for more patients is now dangerous for both doctors and patients, he explains.

Genuine emergencies should not be limited, but most so-called emergencies are minor ailments or simple queries, and none of these justify working into the evening, he argues. He adds: "Every problem is important to every patient, and we should recognise that, but we cannot keep giving until we might make a potentially serious error or become ill ourselves through overwork".

He points out that many GPs start out with a 'fixed number' of appointments - 18 consultations in each half day. "But we also have a policy to turn away nobody who says he or she is in need. We cope with this load," he says.

"The time has come when the public has to be told that it is unsafe for them to be seen when the GP is not thinking optimally, and that a tired GP risks harming patient - and doctor," he concludes.

But Michael Griffiths, a GP partner in South Wales, says caps inhibit professionalism and might themselves cause harm.

He points out that successive governments and management regimes have gradually transferred more work, until the [medical] profession feels overwhelmed by the excessive workload and patient safety is compromised.

"[Capping appointments] is the wrong way, because it limits our flexibility and professionalism when dealing with patients, but mainly because it does not address the question of bringing additional resources into primary care to manage work that we could undertake if properly funded.

"A cap may be a useful negotiating tool, but it should never become an end in itself. What is needed is a greater proportion of NHS resource coming to primary care to allow us to properly administer our practices, allowing the right professional enough time to devote to each patient," he concludes.

In a linked patient commentary, Jennifer Skillen, a frequent primary care patient, acknowledges that sometimes only a face-to-face appointment with a GP is suitable.

"A fifth of patients are already having to wait more than two weeks for an appointment, which is too long. Introducing caps to the number of daily consultations would likely make this unsafe situation worse.

"What the NHS needs is fundamental system change. We must look beyond short-term tweaks and develop long term strategies in the NHS that support GPs to support patients," she concludes.

Credit: 
BMJ Group

Many newborn screening recommendations do not assess key evidence on benefits and harms

Many national recommendations on whether to screen newborn babies for rare conditions do not assess the evidence on the key benefits and harms of screening, warn researchers in a study published by The BMJ today.

Effective screening programmes can save lives, whereas ineffective programmes can do more harm than good, yet decisions about which conditions to screen for vary widely between countries, despite similar populations and healthcare systems.

Reasons for these differences are unclear, but it has been suggested that differences in the evidence review process used to generate policy - in particular the use of systematic reviews - may play a role.

Systematic reviews bring together evidence from existing studies and use statistical methods to summarise the results, to help make evidence-based decisions.

To explore this further, a team of UK researchers assessed whether use of a systematic review affects national decisions on whether to screen for a range of conditions using the newborn blood spot test, which is offered to every baby to detect rare but serious health conditions.

Their analysis included 93 reports that assessed 104 conditions across 14 countries, giving a total of 276 recommendations.

Screening was favoured in 159 (58%) recommendations, not favoured in 98 (36%), and not recommended either way in 19 (7%).

Only 60 (22%) of the recommendations were based on evidence from a systematic review. Use of a systematic review was associated with a reduced probability of screening being recommended (38% v 63%).

Evidence for test accuracy was not considered in 115 (42%) of recommendations, while evidence around the benefits of early detection and the potential harm of overdiagnosis were not considered in 83 (30%) and 211 (76%) of recommendations, respectively.
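The proportions quoted above follow directly from the raw counts of the 276 recommendations; a quick check:

```python
# Reproducing the rounded percentages quoted above from the raw counts
# (276 national recommendations in total).
total = 276
counts = {
    "screening favoured": 159,
    "screening not favoured": 98,
    "no recommendation either way": 19,
    "based on a systematic review": 60,
    "test accuracy not considered": 115,
    "benefit of early detection not considered": 83,
    "harm of overdiagnosis not considered": 211,
}
for label, n in counts.items():
    print(f"{label}: {n}/{total} = {round(100 * n / total)}%")
```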

The researchers point to some study limitations, the key one being that use of systematic review methods may have been driven by country level factors. However, strengths include the large number of documents analysed and the ability to take account of potentially influential factors across different conditions.

"This study showed that many national policy decisions about whether to screen for conditions are being made without systematically reviewing the evidence," say the authors. "Yet it remains essential to make evidence based policy decisions because once screening programmes are started they are difficult to stop."

They call for further research "to understand why policy makers do not employ systematic review methods in their evaluations of evidence" - and they propose more international collaboration to undertake such reviews.

Credit: 
BMJ Group

New research reveals how energy dissipates outside Earth's magnetic field

image: In this visualization, as the supersonic solar wind (yellow haze) flows around the Earth's magnetic field (blue wavy lines), it forms a highly turbulent boundary layer called the 'magnetosheath' (yellow swirling area). A new research paper describes observations of small-scale magnetic reconnection within the magnetosheath, revealing important clues about heating in the sun's outer layers and elsewhere in the universe.

Image: 
NASA/GSFC

Earth's magnetic field provides an invisible but crucial barrier that protects Earth from the solar wind--a stream of charged particles launched from the sun's outer layers. The protective properties of the magnetic field can fail due to a process known as magnetic reconnection, which occurs when two opposing magnetic field lines break and reconnect with each other, dissipating massive amounts of energy and accelerating particles that threaten air traffic and satellite communication systems.

Just outside of Earth's magnetic field, the solar wind's onslaught of electrons and ionized gases creates a turbulent maelstrom of magnetic energy known as the magnetosheath. While magnetic reconnection has been well documented closer to Earth, physicists have sought to determine whether reconnection also happens in this turbulent zone.

A new research paper co-authored by University of Maryland Physics Professor James Drake suggests that the answer to this question is yes. The observations, published in the May 10, 2018 issue of the journal Nature, provide the first evidence of magnetic reconnection occurring at very small spatial scales in the turbulent magnetosheath. However, unlike the reconnection that occurs with the Earth's magnetic field, which involves electrons as well as ions, turbulent reconnection in the magnetosheath involves electrons alone.

"We know that magnetic energy in churning, turbulent systems cascades to smaller and smaller scales. At some point that energy is fully dissipated. The big question is how that happens, and what role magnetic reconnection plays at such small scales," Drake said. "This study shows that reconnection indeed can happen at the electron scale, with no ions involved at all, suggesting that reconnection may help dissipate magnetic energy at very small scales."

By drawing a clearer picture of the physics of magnetic reconnection, the discovery promises to advance scientists' understanding of several open questions in solar physics. For example, electron-scale magnetic reconnection may play a role in heating of the solar corona--an expansive layer of charged particles that surrounds the sun and reaches temperatures hundreds of times higher than the sun's visible surface. This in turn could help explain the physics of the solar wind, as well as the nature of turbulent magnetic systems elsewhere in space.

NASA's Magnetospheric Multiscale (MMS) mission gathered the data for the analysis. Flying in a pyramid formation with as little as 4.5 miles' distance between four identical spacecraft, MMS imaged electrons within the pyramid once every 30 milliseconds. These highly precise measurements enabled the researchers to capture turbulent, electron-only magnetic reconnection, a phenomenon not previously observed.

"MMS discovered electron magnetic reconnection, a new process much different from the standard magnetic reconnection that happens in calmer areas around Earth," said Tai Phan, a senior fellow in the Space Sciences Laboratory at the University of California, Berkeley and the lead author of the paper. "This finding helps scientists understand how turbulent magnetic fields dissipate energy throughout the cosmos."

Because turbulent reconnection involves only electrons, it remained hidden from scientists looking for the telltale signature of standard magnetic reconnection: ion jets. Compared with standard reconnection, in which broad jets of ions stream out tens of thousands of miles from the site of reconnection, turbulent reconnection ejects narrow jets of electrons only a couple miles wide.

But MMS scientists were able to leverage the design of one instrument, the Fast Plasma Investigation, to create a technique that allowed them to read between the lines and gather extra data points to resolve the jets.

"The key event of the paper happens in 45 milliseconds. This would be one data point with the regular data," said Amy Rager, a graduate student at the Catholic University of America in Washington, D.C., who worked at NASA's Goddard Space Flight Center to develop the technique. "But instead we can get six to seven data points in that region with this method, allowing us to understand what is happening."
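Rager's numbers can be sanity-checked with simple arithmetic (illustrative only, using the figures quoted above):

```python
# Back-of-envelope check of the sampling figures quoted above: a 45 ms event
# against the instrument's regular 30 ms cadence versus the enhanced technique.
event_ms = 45
regular_cadence_ms = 30
samples_regular = event_ms // regular_cadence_ms
print(f"regular cadence: about {samples_regular} data point(s) in the event")

# With six to seven points across the same 45 ms, the effective spacing is:
effective_spacing_ms = event_ms / 6.5
print(f"enhanced technique: one point every ~{effective_spacing_ms:.0f} ms")
```

A 45 ms event thus spans barely one regular sample, while the enhanced technique resolves its internal structure with roughly 7 ms spacing.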

With the new method, MMS scientists are hopeful they can comb through existing data sets to find more of these events and other unexpected discoveries as well.

"There were some surprises in the data," Drake said. "Magnetic reconnection occurs when you have two magnetic fields pointing in opposite directions and they annihilate each other. In the present case a large ambient magnetic field survived after annihilation occurred. Honestly, we were surprised that turbulent reconnection at very small scales could occur with this background magnetic field present."

Magnetic reconnection occurs throughout the universe, so whatever scientists learn about it near Earth can be applied to other phenomena. For example, the discovery of turbulent electron reconnection may help scientists understand the role that magnetic reconnection plays in heating the inexplicably hot solar corona--the sun's outer atmosphere--and accelerating the supersonic solar wind. NASA's upcoming Parker Solar Probe mission will travel directly toward the sun in the summer of 2018 to investigate these questions--armed with a new understanding of magnetic reconnection near Earth.

Credit: 
University of Maryland

The weak side of the proton

image: The Q-weak experiment was conducted in Jefferson Lab's Experimental Hall C, and its goal was to very precisely measure the proton's weak charge, a term that quantifies the influence that the weak force can exert on protons. The Q-weak apparatus, shown here, was installed in the hall for the experimental run, which concluded in 2012.

Image: 
DOE's Jefferson Lab

A new result from the Q-weak experiment at the Department of Energy's Thomas Jefferson National Accelerator Facility provides a precision test of the weak force, one of four fundamental forces in nature. This result, published recently in Nature, also constrains possibilities for new particles and forces beyond our present knowledge.

"Precision measurements like this one can act as windows into a world of potential new particles that otherwise might only be observable using extremely high-energy accelerators that are currently beyond the reach of our technical capabilities," said Roger Carlini, a Jefferson Lab scientist and a co-spokesperson for the Q-weak Collaboration.

While the weak force is difficult to observe directly, its influence can be felt in our everyday world. For example, it initiates the chain of reactions that power the sun and it provides a mechanism for radioactive decays that partially heat the Earth's core and that also enable doctors to detect disease inside the body without surgery.

Now, the Q-weak Collaboration has revealed one of the weak force's secrets: the precise strength of its grip on the proton. They did this by measuring the proton's weak charge to high precision, which they probed using the high-quality beams available at the Continuous Electron Beam Accelerator Facility, a DOE Office of Science User Facility.

The proton's weak charge is analogous to its more familiar electric charge, a measure of the influence the proton experiences from the electromagnetic force. These two interactions are closely related in the Standard Model, a highly successful theory that describes the electromagnetic and weak forces as two different aspects of a single force that interacts with subatomic particles.

To measure the proton's weak charge, an intense beam of electrons was directed onto a target containing cold liquid hydrogen, and the electrons scattered from this target were detected in a precise, custom-built measuring apparatus. The key to the Q-weak experiment is that the electrons in the beam were highly polarized - prepared prior to acceleration to be mostly "spinning" in one direction, parallel or anti-parallel to the beam direction. With the direction of polarization rapidly reversed in a controlled manner, the experimenters were able to latch onto the weak interaction's unique property of parity (akin to mirror symmetry) violation, in order to isolate its tiny effects to high precision: a difference in scattering rate of about 2 parts in 10 million was measured between the two beam polarization states.
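The quoted rate difference corresponds to a tiny asymmetry between the two polarization states, which can be sketched as follows (an illustrative normalization, not the collaboration's analysis code):

```python
# Illustrative sketch of a parity-violating asymmetry: the fractional
# difference in scattering rate between the two beam polarization states.
def asymmetry(rate_parallel, rate_antiparallel):
    return (rate_parallel - rate_antiparallel) / (rate_parallel + rate_antiparallel)

# An asymmetry of about 2 parts in 10 million, as quoted above; the base
# rate of 1.0 is an arbitrary normalization.
a = asymmetry(1.0 + 2e-7, 1.0 - 2e-7)
print(f"asymmetry ~ {a:.1e}")
```

Measuring a fractional effect this small is why the experiment depends on rapid, controlled polarization reversal: slow drifts in the apparatus cancel between the two states.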

The proton's weak charge was found to be QWp = 0.0719 ± 0.0045, which is in excellent agreement with predictions of the Standard Model, which takes into account all known subatomic particles and the forces that act on them. Because the proton's weak charge is so precisely predicted in this model, the new Q-weak result provides insight into predictions of hitherto unobserved heavy particles, such as those that may be produced by the Large Hadron Collider (LHC) at CERN in Europe or by future high-energy particle accelerators.

"This very challenging experimental result is yet another clue in the world-wide search for new physics beyond our current understanding. There is ample evidence the Standard Model of particle physics provides only an incomplete description of nature's phenomena, but where the breakthrough will come remains elusive," said Timothy J. Hallman, Associate Director for Nuclear Physics of the Department of Energy Office of Science. "Experiments like Q-weak are pressing ever closer to finding the answer."

For example, the Q-weak result has set limits on the possible existence of leptoquarks, which are hypothetical particles that can reverse the identities of two broad classes of very different fundamental particles - turning quarks (the building blocks of nuclear matter) into leptons (electrons and their heavier counterparts) and vice versa.

"After more than a decade of careful work, Q-weak not only informed the Standard Model, it showed that extreme precision can enable moderate-energy experiments to achieve results on par with the largest accelerators available to science," said Anne Kinney, Assistant Director for the Mathematical and Physical Sciences Directorate at the National Science Foundation. "Such precision will be important in the hunt for physics beyond the Standard Model, where new particle effects would likely appear as extremely tiny deviations."

"It's complementary information. So, if they find evidence for new physics in the future at the LHC, we can help identify what it might be, from the limits that we're setting already in this paper," said Greg Smith, Jefferson Lab scientist and Q-weak project manager.

Credit: 
DOE/Thomas Jefferson National Accelerator Facility