
Status report: OSIRIS-REx completes closest flyover of sample site Nightingale

image: During the OSIRIS-REx Reconnaissance B flyover of primary sample collection site Nightingale, the spacecraft left its safe-home orbit to pass over the sample site at an altitude of 0.4 miles (620 m). The pass, which took 11 hours, gave the spacecraft's onboard instruments the opportunity to take the closest-ever science observations of the sample site.

Image: 
NASA/Goddard/University of Arizona

Preliminary results indicate that NASA's OSIRIS-REx spacecraft successfully executed a 0.4-mile (620-m) flyover of site Nightingale yesterday as part of the mission's Reconnaissance B phase activities. Nightingale, OSIRIS-REx's primary sample collection site, is located within a crater high in asteroid Bennu's northern hemisphere.

To perform the pass, the spacecraft left its 0.75-mile (1.2-km) safe home orbit and flew an almost 11-hour transit over the asteroid, aiming its science instruments toward the 52-ft (16-m) wide sample site before returning to orbit. Science observations from this flyover are the closest taken of a sample site to date.

The primary goal of the Nightingale flyover was to collect the high-resolution imagery required to complete the spacecraft's Natural Feature Tracking image catalog, which will document the sample collection site's surface features - such as boulders and craters. During the sampling event, which is scheduled for late August, the spacecraft will use this catalog to navigate with respect to Bennu's surface features, allowing it to autonomously predict where on the sample site it will make contact. Several of the spacecraft's other instruments also took observations of the Nightingale site during the flyover event, including the OSIRIS-REx Thermal Emission Spectrometer (OTES), the OSIRIS-REx Visible and Infrared Spectrometer (OVIRS), the OSIRIS-REx Laser Altimeter (OLA), and the MapCam color imager.

A similar flyover of the backup sample collection site, Osprey, is scheduled for Feb. 11. Even lower flybys will be performed later this spring - Mar. 3 for Nightingale and May 26 for Osprey - as part of the mission's Reconnaissance C phase activities. The spacecraft will perform these two flyovers at an altitude of 820 feet (250 m), which will be the closest it has ever flown over asteroid Bennu's surface.

Credit: 
NASA/Goddard Space Flight Center

Here, there and everywhere: Large and giant viruses abound globally

image: Art illustration capturing giant virus genomic diversity. This image complements a January 22, 2020, Nature paper led by researchers at the Department of Energy Joint Genome Institute uncovering a broad diversity of large and giant viruses that belong to the nucleocytoplasmic large DNA viruses (NCLDV) supergroup. The team reconstructed 2,074 genomes of large and giant viruses from more than 8,500 publicly available metagenome datasets generated from sampling sites around the world, and virus diversity in this group expanded 10-fold from just 205 genomes, redefining the phylogenetic tree of giant viruses.

Image: 
Zosia Rostomian/Berkeley Lab

While the microbes in a single drop of water could outnumber a small city's population, the number of viruses in the same drop--the vast majority not harmful to humans--could be even larger. Viruses infect bacteria, archaea and eukaryotes, and they range in particle and genome size from small to large and even giant. The genomes of giant viruses are on the order of 100 times the size of what has typically been associated with viruses, while the genomes of large viruses may be only 10 times larger. And yet, while they are found everywhere, comparatively little is known about viruses, much less those considered large and giant.

In a recent study published in the journal Nature, a team led by researchers at the U.S. Department of Energy (DOE) Joint Genome Institute (JGI), a DOE Office of Science User Facility located at Lawrence Berkeley National Laboratory (Berkeley Lab), uncovered a broad diversity of large and giant viruses that belong to the nucleocytoplasmic large DNA viruses (NCLDV) supergroup. The expansion of the diversity of large and giant viruses offered the researchers insights into how they might interact with their hosts, and how those interactions may in turn impact the host communities and their roles in carbon and other nutrient cycles.

"This is the first study to take a more global look at giant viruses by capturing genomes of uncultivated giant viruses from environmental sequences across the globe, then using these sequences to make inferences about the biogeographic distribution of these viruses in the various ecosystems, their diversity, their predicted metabolic features and putative hosts," noted study senior author Tanja Woyke, who heads JGI's Microbial Program.

The team mined more than 8,500 publicly available metagenome datasets generated from sampling sites around the world, including data from several DOE-mission-relevant proposals through JGI's Community Science Program. Of particular interest were proposals from researchers at Concordia University (Canada), the University of Michigan, the University of Wisconsin-Madison, and the Georgia Institute of Technology, which focused on microbial communities from freshwater ecosystems including, respectively, the northern lakes of Canada, the Laurentian Great Lakes, Lake Mendota and Lake Lanier.

Sifting Out and Reconstructing Virus Genomes

Much of what is known about the NCLDV group has come from viruses that have been co-cultivated with amoeba or with their hosts, though metagenomics is now making it possible to seek out and characterize uncultivated viruses. For instance, a 2018 study from a JGI-led team uncovered giant viruses in the soil for the first time. The current study applied a multi-step approach to mine, bin and then filter the data for the major capsid protein (MCP) to identify NCLDV viruses. JGI researchers previously applied this approach to uncover a novel group of giant viruses dubbed "Klosneuviruses."

Previously known members of the viral lineages in the NCLDV group infect mainly protists and algae, and some of them have genomes in the megabase range. The study's lead and co-corresponding author Frederik Schulz, a research scientist in Woyke's group, used the MCP as a barcode to sift out virus fragments, reconstructing 2,074 genomes of large and giant viruses. More than 50,000 copies of the MCP were identified in the metagenomic data, two-thirds of which could be assigned to viral lineages, and predominantly in samples from marine (55%) and freshwater (40%) environments. As a result, the giant virus protein space grew from 123,000 to over 900,000 proteins, and virus diversity in this group expanded 10-fold from just 205 genomes, redefining the phylogenetic tree of giant viruses.
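As a rough illustration of this kind of marker-gene screen, the snippet below counts putative major capsid protein annotations in protein files and tallies them by sampling environment. This is a hedged sketch only, not the study's actual pipeline; the file names, the annotation keyword and the environment labels are assumptions made for the example.

```python
# Minimal illustrative sketch (not the authors' pipeline): tally putative major
# capsid protein (MCP) hits in annotated metagenome protein FASTA files and
# group them by sampling environment.
from collections import Counter

def count_mcp_hits(fasta_path):
    """Count FASTA records whose header mentions a major capsid protein."""
    hits = 0
    with open(fasta_path) as handle:
        for line in handle:
            if line.startswith(">") and "major capsid protein" in line.lower():
                hits += 1
    return hits

# Hypothetical inputs: one annotated protein FASTA per metagenome, tagged by environment.
samples = {"marine_sample.faa": "marine", "lake_sample.faa": "freshwater"}
per_environment = Counter()
for path, environment in samples.items():
    per_environment[environment] += count_mcp_hits(path)
print(per_environment)
```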

Metabolic Reprogramming a Common Strategy for Large and Giant Viruses

Another significant finding from the study was a common strategy employed by both large and giant viruses. Metabolic reprogramming, Schulz explained, makes the host function better under certain conditions, which then helps the virus to replicate faster and produce more progeny. This can have short- and long-term impacts on host metabolism in general, or on host populations affected by adverse environmental conditions. Function prediction on the 2,000 new giant virus genomes led the team to uncover a prevalence of encoded functions that could boost host metabolism, such as genes that play roles in the uptake and transport of diverse substrates, and also photosynthesis genes including potential light-driven proton pumps. "We're seeing that this is likely a common strategy among the large and giant viruses based on the predicted metabolism that's encoded in the viral genomes," he said. "It seems to be way more common than had been previously thought."

Woyke noted that despite the number of metagenome-assembled genomes (MAGs) reconstructed from this effort, the team was still unable to link 20,000 major capsid proteins of large and giant viruses to any known virus lineage. "Getting complete, near complete, or partial giant virus genomes reconstructed from environmental sequences is still challenging and even with this study we are likely to just scratch the surface of what's out there. Beyond these 2,000 MAGs extracted from 8,000 metagenomes, there is still a lot of giant virus diversity that we're missing in the various ecosystems. We can detect a lot more MCPs than we can extract MAGs, and they don't fit in the genome tree of viral diversity - yet."

"We expect this to change with not only new metagenome datasets becoming available but also complementary single-cell sorting and sequencing of viruses together with their unicellular hosts," Schulz added.

Credit: 
DOE/Lawrence Berkeley National Laboratory

3,000-year-old teeth solve Pacific banana mystery

image: The findings were made from 3,000-year-old skeletons at Teouma, the oldest archaeological cemetery in Remote Oceania, a region that includes Vanuatu and all of the Pacific islands east and south, including Hawaii, Rapa Nui and Aotearoa.

Image: 
University of Otago

Humans began transporting and growing banana in Vanuatu 3,000 years ago, a University of Otago scientist has discovered.

The discovery is the earliest evidence of humans taking and cultivating banana into what was the last area of the planet to be colonised.

In an article published this week in Nature Human Behaviour, Dr Monica Tromp, Senior Laboratory Analyst at the University of Otago's Southern Pacific Archaeological Research (SPAR), found microscopic particles of banana and other plants trapped in calcified dental plaque of the first settlers of Vanuatu.

The finds came from 3,000-year-old skeletons at the Teouma site on Vanuatu's Efate Island.

Dr Tromp used microscopy to look for 'microparticles' in the plaque, also known as dental calculus, scraped from the teeth of the skeletons. That allowed her to discover some of the plants people were eating and using to make materials like fabric and rope in the area when it was first colonised.

Teouma is the oldest archaeological cemetery in Remote Oceania, a region that includes Vanuatu and all of the Pacific islands east and south, including Hawaii, Rapa Nui and Aotearoa. The Teouma cemetery is unique because it is uncommon to find such well-preserved archaeological burials in the Pacific. Bone generally does not preserve in hot and humid climates and the same is true for things made of plant materials and also food.

The first inhabitants of Vanuatu were people associated with the Lapita cultural complex who originated in Island South East Asia and sailed into the Pacific on canoes, reaching the previously uninhabited islands of Vanuatu around 3000 years ago.

There has been debate about how the earliest Lapita people survived when they first arrived to settle Vanuatu and other previously untouched islands in the Pacific. It is thought Lapita people brought domesticated plants and animals with them on canoes - a transported landscape. But direct evidence for these plants had not been found at Teouma until Dr Tromp's study.

"One of the big advantages of studying calcified plaque or dental calculus is that you can find out a lot about otherwise invisible parts of people's lives," Dr Tromp says. Plaque calcifies very quickly and can trap just about anything you put inside of your mouth - much like the infamous Jurassic Park mosquito in amber - but they are incredibly small things that you can only see with a microscope."

The study began as part of Dr Tromp's PhD research in the Department of Anatomy and involved collaboration with the Vanuatu Cultural Centre, Vanuatu National Herbarium and the community of Eratap village - the traditional landowners of the Teouma site.

Dr Tromp spent hundreds of hours in front of a microscope finding and identifying microparticles extracted from thirty-two of the Teouma individuals. The positive identification of banana (Musa sp.) is direct proof it was brought with the earliest Lapita populations to Vanuatu.

Palm species (Arecaceae) and non-diagnostic tree and shrub microparticles were also identified, indicating these plants were also important to the lives of this early population, possibly for use as food or food wrapping, fabric and rope making, or for medicinal purposes, Dr Tromp says.

"The wide, and often unexpected range of things you can find in calcified plaque makes what I do both incredibly exciting and frustrating at the same time."

The article was co-authored by Elizabeth Matisoo-Smith, Rebecca Kinaston and Hallie Buckley of the University of Otago, and Stuart Bedford and Matthew Spriggs of the Australian National University. It can be found here: https://www.nature.com/articles/s41562-019-0808-y

Credit: 
University of Otago

Study uncovers unexpected connection between gliomas, neurodegenerative diseases

A protein typically associated with neurodegenerative diseases like Alzheimer's might help scientists explore how gliomas, a type of cancerous brain tumor, become so aggressive.

The new study, in mouse models and human brain tumor tissues, was published in Science Translational Medicine and found significant expression of the protein TAU in glioma cells, especially in patients with better prognoses.

Patients with glioma are given a better prognosis when their tumor expresses a mutation in a gene called isocitrate dehydrogenase 1 (IDH1). In this international collaborative study led by the Instituto de Salud Carlos III-UFIEC in Madrid, Spain, those IDH1 mutations stimulated the expression of TAU. Then, the presence of TAU acted as a brake on the formation of new blood vessels, which are necessary for the aggressive behavior of the tumors.

"We report that the levels of microtubule-associated protein TAU, which have been associated with neurodegenerative diseases, are epigenetically controlled by the balance between normal and mutant IDH1/2 in mouse and human gliomas," says co-author Maria G. Castro, Ph.D., a professor of neurosurgery and cell and developmental biology at Michigan Medicine (University of Michigan). "In IDH1/2 mutant tumors, we found that expression levels of TAU decreased with tumor progression."

That means levels of TAU could be used as a biomarker for tumor progression in mutant IDH1/2 gliomas, Castro says.

Credit: 
Michigan Medicine - University of Michigan

Groups publish statements on CT contrast use in patients with kidney disease

OAK BROOK, Ill. - The risk of administering modern intravenous iodinated contrast media in patients with reduced kidney function has been overstated, according to new consensus statements from the American College of Radiology (ACR) and the National Kidney Foundation (NKF), published in the journal Radiology.

Intravenous iodinated contrast media are commonly used with computed tomography (CT) to evaluate disease and to determine treatment response. Although patients have benefited from their use, iodinated contrast media have been denied or delayed in patients with reduced kidney function due to the perceived risks of contrast-induced acute kidney injury. This practice can hinder a timely and accurate diagnosis in these patients.

"The historical fears of kidney injury from contrast-enhanced CT have led to unmeasured harms related to diagnostic error and diagnostic delay," said lead author Matthew S. Davenport, M.D., associate professor of radiology and urology at the University of Michigan in Ann Arbor, Michigan. "Modern data clarify that this perceived risk has been overstated. Our intent is to provide multi-disciplinary guidance regarding the true risk to patients and how to apply a consideration of that risk to modern clinical practice."

These consensus statements were developed to improve and standardize the care of patients with impaired kidney function who may need to undergo exams that require intravenous iodinated contrast media to provide the clearest images and allow for the most informed diagnosis.

In clinical practice, many factors are used to determine whether intravenous contrast media should be administered. These include probability of an accurate diagnosis, alternative methods of diagnosis, risks of misdiagnosis, expectations about kidney function recovery, and risk of allergic reaction. Decisions are rarely based on a single consideration, such as risk of an adverse event specifically related to kidney impairment. Consequently, the authors advise that these statements be considered in the context of the entire clinical scenario.

Importantly, the report outlines the key differences between contrast-induced acute kidney injury (CI-AKI) and contrast-associated acute kidney injury (CA-AKI). In CI-AKI, a causal relationship exists between contrast media and kidney injury, whereas in CA-AKI, a direct causal relationship has not been demonstrated. The authors suggest that studies that have not properly distinguished the two have contributed to the overstatement of risk.

"A primary explanation for the exaggerated perceived nephrotoxic risk of contrast-enhanced CT is nomenclature," Dr. Davenport said. "'Contrast-induced' acute kidney injury implies a causal relationship. However, in many circumstances, the diagnosis of CI-AKI in clinical care and in research is made in a way that prevents causal attribution. Disentangling contrast-induced AKI (causal AKI) from contrast-associated AKI (correlated AKI) is a critical step forward in improving understanding of the true risk to patients."

The statements answer key questions and provide recommendations for use of intravenous contrast media in treating patients with varying degrees of impaired kidney function.

Although the true risk of CI-AKI remains unknown, the authors recommend intravenous normal saline for patients without contraindication, such as heart failure, who have acute kidney injury or an estimated glomerular filtration rate (eGFR) less than 30 mL/min per 1.73 m² and who are not undergoing maintenance dialysis. In individual and unusual high-risk circumstances (patients with multiple comorbid risk factors), prophylaxis may be considered in patients with an eGFR of 30-44 mL/min per 1.73 m² at the discretion of the ordering clinician.
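To make those thresholds concrete, here is a minimal, purely illustrative sketch of how the eGFR cutoffs described above might be encoded in software. It is not the ACR/NKF statement itself and is not clinical guidance; the function and parameter names are assumptions made for this example.

```python
# Minimal illustrative sketch of the eGFR thresholds described above (not
# clinical guidance, and not the ACR/NKF consensus statement itself).
def prophylaxis_recommended(egfr, has_aki, on_maintenance_dialysis,
                            high_risk_comorbidities=False):
    """Return a rough indication of whether IV normal saline prophylaxis applies."""
    if on_maintenance_dialysis:
        return "not indicated"
    if has_aki or egfr < 30:
        return "recommended (absent contraindications such as heart failure)"
    if 30 <= egfr < 45 and high_risk_comorbidities:
        return "may be considered at the ordering clinician's discretion"
    return "not routinely indicated"

print(prophylaxis_recommended(egfr=25, has_aki=False, on_maintenance_dialysis=False))
```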

The presence of a solitary kidney should not independently influence decision making regarding the risk of CI-AKI. Lowering of contrast media dose below a known diagnostic threshold should be avoided due to the risk of lowering diagnostic accuracy. Also, when feasible, medications that are toxic to the kidneys should be withheld by the referring clinician in patients at high risk. However, renal replacement therapy should not be initiated or altered solely based on contrast media administration.

The authors emphasize that prospective controlled data are needed in adult and pediatric populations to clarify the risk of CI-AKI.

Credit: 
Radiological Society of North America

Visits to pediatricians on the decline

image: A new study led by the University of Pittsburgh and UPMC Children's Hospital finds that commercially insured children are visiting the pediatrician less often.

Image: 
UPMC

PITTSBURGH, Jan. 21, 2020 - Commercially insured children in the U.S. are seeing pediatricians less often than they did a decade ago, according to a new analysis led by a pediatrician-scientist at the University of Pittsburgh and UPMC Children's Hospital of Pittsburgh.

But whether that's good or bad is unclear, the researchers say in the study, published today in JAMA Pediatrics.

"There's something big going on here that we need to be paying attention to," said lead author Kristin Ray, M.D., M.S., assistant professor of pediatrics in Pitt's School of Medicine. "The trend is likely a combination of both positive and negative changes. For example, if families avoid bringing their kids in because of worry about high co-pays and deductibles, that's very concerning. But if this is the result of better preventive care keeping kids healthier or perhaps more physician offices providing advice over the phone to support parents caring for kids at home when they've got minor colds or stomach bugs, that's a good thing."

Ray and her colleagues examined insurance claims data from 2008 through 2016 for children 17 years old and younger. The data came from a large commercial health plan that covers millions of children across all 50 states with a range of benefit options.

In that time span, primary care visits for any reason decreased by 14%.

Preventive care, or "well child" visits, increased by nearly 10%. This change occurred during the years when the Affordable Care Act eliminated co-pays for such visits. But that increase was eclipsed by a much larger decrease in problem-based visits for things such as illness or injury, with these visits declining by 24%. Among problem-based visits, decreases were seen for all types of diagnoses, except for psychiatric and behavioral health visits, which increased by 42%.

"This means that children and their families are visiting their pediatrician less throughout the year, presumably resulting in fewer opportunities for the pediatrician to connect with families on preventive care and healthy behaviors, like vaccinations and good nutrition," said Ray, also a pediatrician and director of health system improvements at UPMC Children's Community Pediatrics. "The question is: Why? We don't have the definitive answer, but our data give us some clues."

One possible explanation is that children are getting care elsewhere. Visits to urgent care, retail clinics and telemedicine consults for problem-based care increased during the study period. But that increase accounted for only about half of the decrease in visits to primary care pediatricians.

Higher out-of-pocket costs probably also explain why some parents aren't taking their children to the pediatrician for medical concerns, Ray said. During the time period studied, out-of-pocket costs for problem-based visits increased 42%, while inflation-adjusted median household income rose by only 5%. Previous studies have found that even $1-$10 increases in copayments are associated with fewer visits.

Other factors also could be at play, the research team noted. With more parents working, some may find it difficult to bring children in for care. And there may be less need for some visits. Vaccination has dramatically reduced rates of ear infections and hospitalizations. Pediatricians are being more careful with prescribing antibiotics, and this could be causing parents to watch children with cold symptoms for longer before seeking care. Recent research showing that children with ear or urinary tract infections do not always need to come back for rechecks also may have cut down on the number of visits for problems. And parents have ever-increasing amounts of information available to them online as they are deciding whether to seek care.

The drop in visits is not isolated to children. "This decline among children is echoed in other studies among younger and older adults," added senior author Ateev Mehrotra, M.D., M.P.H., associate professor of health care policy at Harvard Medical School. "Due to a variety of forces, Americans are not as connected with their primary care providers."

Credit: 
University of Pittsburgh

Opioid prescriptions affected by computer settings

Simple, no-cost computer changes can affect the number of opioid pills prescribed to patients, according to a new UC San Francisco study.

Researchers found that when default settings, showing a preset number of opioid pills, were modified downward, physicians prescribed fewer pills. Fewer pills could improve prescription practices and protect patients from developing opioid addictions.

The study publishes Tuesday, January 21, 2020 in JAMA Internal Medicine.

"It's striking that even in the current environment, where doctors know about the risks from opioids and are generally thoughtful about prescribing them, this intervention affected prescribing behavior," said senior author Maria C. Raven, MD, MPH, chief of emergency medicine at UCSF and vice chair of the UCSF Department of Emergency Medicine. "The findings are really exciting because of their potential to impact patient care at a large level. Reducing the quantities of opioids prescribed may help protect patients from developing opioid use disorder."

Prescription opioids play a significant role in the ongoing national public health crisis that has taken a massive toll on many communities.

Some addictions stem from an initial opioid prescription for acute pain in individuals who never previously took the pain medications, adding to the overall tragedy. As a result, emergency departments, hospitals and government policymakers have worked to decrease opioid prescribing through provider education and published guidelines, with mixed success.

In an effort to determine whether default settings could influence quantities prescribed, investigators in the new study examined opioid prescribing at two emergency departments, UCSF Medical Center and Highland Hospital, a trauma center and safety-net teaching hospital in Oakland, between November 2016 and July 2017.

Over the course of 20 weeks, the researchers randomly changed the default settings on electronic medical records for commonly prescribed opioids such as oxycodone, Percocet, and Norco, for four weeks at a time. Before the study, the electronic medical records had defaults for pain medications of 12 pills at Highland and 20 pills at UCSF. The researchers used preset quantities of 5, 10 and 15 pills, and also tested a blank setting that forced physicians to enter a number. Physicians could increase or decrease the number to whichever they felt was most appropriate for each patient. Altogether, 4,320 opioid prescriptions were analyzed.

The researchers found that changing default quantities affected the number of pills prescribed. Lower defaults were associated with lower quantities of opioids prescribed and a lower proportion of prescriptions that exceeded prescribing recommendations from the federal Centers for Disease Control and Prevention.

The authors noted that they considered the risk of patient harm to be "very low," and that the risk of overprescribing was far greater than the risk of under-prescribing.

"Every electronic health record throughout the country already has default settings for opioids," said lead author Juan Carlos Montoy, MD, PhD, assistant professor of emergency medicine at UCSF. "What we've shown is that default settings matter, and can be changed to improve opioid prescribing. Importantly, this is cost free and preserves physician autonomy to do what they think is best for each patient."

"Our findings add to a large body of research from behavioral economics that has shown that defaults can be used to change behavior," Montoy said. "The opioid epidemic is complex and this certainly won't fix it, but it is one more tool we can use to address it."

Credit: 
University of California - San Francisco

Montana State astrophysicist finds massive black holes wandering around dwarf galaxies

image: A new search led by Montana State University astrophysicist Amy Reines has revealed more than a dozen massive black holes in dwarf galaxies that were previously considered too small to host them, surprising scientists with their location within the galaxies.

Image: 
MSU Photo by Adrian Sanchez-Gonzalez

BOZEMAN -- A new search led by Montana State University has revealed more than a dozen massive black holes in dwarf galaxies that were previously considered too small to host them, and surprised scientists with their location within the galaxies.

The study, headed by MSU astrophysicist Amy Reines, searched 111 dwarf galaxies within a billion light years of Earth using the National Science Foundation's Karl G. Jansky Very Large Array at the National Radio Astronomy Observatory, two hours outside Albuquerque in the plains of New Mexico. Reines identified 13 galaxies that "almost certainly" host massive black holes and found something unexpected: The majority of the black holes were not in the location she anticipated.

"All of the black holes I had found before were in the centers of galaxies," said Reines, an assistant professor in the Department of Physics in the College of Letters and Science and a researcher in MSU's eXtreme Gravity Institute. "These were roaming around the outskirts. I was blown away when I saw this."

The eXtreme Gravity Institute brings together physicists and astronomers to study phenomena where the forces of gravity are so strong they blur the separation between space and time, such as the big bang, neutron stars and black holes.

There are two main types of black holes, incredibly dense areas of space with gravitational pulls strong enough to capture light. Smaller, stellar black holes form as large stars die and are roughly 10 times the mass of our sun, according to Reines. The other type, known as supermassive or massive black holes, tends to be found at the centers of galaxies and can have masses millions or even billions of times that of our sun. Scientists don't know how they are created.

The Milky Way, a spiral galaxy consisting of somewhere between 100 and 400 billion stars, has a massive black hole at its center, Sagittarius A*. Dwarf galaxies can be of any shape, but are much smaller than the Milky Way, with up to a few billion stars.

Reines' results confirm predictions from recent computer simulations by Jillian Bellovary, an assistant professor at Queensborough Community College in New York and Research Associate at the American Museum of Natural History, which postulated that black holes may often be off-center in dwarf galaxies due to the way galaxies interact as they move through space. The findings may change how scientists look for black holes in dwarf galaxies in the future.

"We need to expand searches to target the whole galaxy, not just the nuclei where we previously expected black holes to be," Reines said.

Reines' paper, "A New Sample of (Wandering) Massive Black Holes in Dwarf Galaxies from High Resolution Radio Observations," was published Jan. 3 in The Astrophysical Journal, and Reines reported the findings at the American Astronomical Society meeting in Honolulu, Hawaii, on Jan. 5.

Reines has been searching the skies for black holes for a decade. As a graduate student at the University of Virginia, she focused on star formation in dwarf galaxies, but in her research she found something else that captured her interest: a massive black hole "in a little dwarf galaxy where it wasn't supposed to be."

Thirty million light years from Earth, the dwarf galaxy Henize 2-10 was previously believed to be too small to host a massive black hole. Conventional wisdom told us that all massive galaxies with a spheroidal component have a massive black hole, Reines explained, and little dwarf galaxies didn't. Yet Reines found one in the center of the dwarf galaxy. It was a "eureka" moment, she said. Her findings were published in the journal Nature in 2011 and Reines turned her research to searching for other black holes in dwarf galaxies.

"Once I started looking for these things on purpose, I started finding a whole bunch," Reines said.

Her next search of the universe shifted to visual data rather than radio signals. It uncovered over 100 possible black holes in the first systematic search of a parent sample of more than 40,000 dwarf galaxies. For her latest search, described in the paper released this month, Reines wanted to go back and look for radio signatures in that sample, which she said would allow her to find massive black holes in star-forming dwarf galaxies. Only one galaxy was identified using both methods.

"There are lots of opportunities to make new discoveries because studying black holes in dwarf galaxies is a new field," she said. "People are definitely captivated by black holes. They're mysterious and fascinating objects."

Reines' discoveries have poured new energy into the search for black holes in dwarf galaxies, opening up new areas of astrophysics as she and other scientists attempt to discover how these massive black holes form.

"When new discoveries break our current understanding of the way things work, we find even more questions than we had before," said Yves Idzerda, head of the Department of Physics at MSU.

Credit: 
Montana State University

Research supports new approach to mine reclamation

image: Land reclaimed using geomorphic techniques blends in with undisturbed terrain in the Gas Hills of Fremont County in central Wyoming.

Image: 
Wyoming DEQ

A new approach to reclaiming lands disturbed by surface mining is having the desired result of improving ecosystem diversity, including restoration of foundation species such as sagebrush, according to a study by University of Wyoming researchers.

The study by Associate Professor Kristina Hufford and graduate student Kurt Fleisher, in the Department of Ecosystem Science and Management, looked at former uranium and coal mine sites in central and southwest Wyoming reclaimed under the Wyoming Department of Environmental Quality's (DEQ) Abandoned Mine Land (AML) program. The research was published recently in the Journal of Environmental Management.

"We found that the areas reclaimed using the new techniques, called geomorphic reclamation, had greater species diversity and improved plant community structure when compared with areas reclaimed using traditional practices," Hufford says. "There is strong evidence that geomorphic reclamation may be a better candidate than traditional reclamation to restore foundation species such as sagebrush in Wyoming."

Traditional reclamation techniques generally have created landscapes with uniform topography and linear slopes, sometimes resulting in problems with erosion, as well as less-than-desired revegetation. Geomorphic reclamation is a relatively novel approach intended to mimic the topography of nearby undisturbed lands, with a wide variety of terrain that is stable and less susceptible to erosion.

DEQ's AML Division used both traditional and geomorphic techniques in reclaiming a former uranium mine in the Gas Hills of Fremont County and a former coal mine north of Rock Springs in Sweetwater County. The seeding of those sites was completed in 2007 and 2009, respectively. With funding from DEQ, the UW scientists examined the sites in the summers of 2017 and 2018 to compare plant growth.

While geomorphic techniques didn't result in landscapes exactly matching undisturbed rangeland at either site, the researchers found that geomorphic reclamation was more successful than traditional reclamation from several perspectives.

Most significantly, there was more plant diversity and species richness, including larger numbers of shrubs such as sagebrush and rabbitbrush. These native species are of particular importance to sage grouse, pronghorn and other wildlife species.

"The results of geomorphic reclamation for shrub recovery may have benefits for species that depend upon sagebrush," Hufford says.

The researchers did find that geomorphic reclamation was more successful at the Gas Hills site than the site north of Rock Springs. They say that could be a result of climate differences between the two locations; the fact that the Gas Hills seeding took place two years earlier; and the fact that more native topsoil was used at the Gas Hills site.

They also suggest that seed mixtures could be adjusted to include more native plant species and come closer to matching vegetation on surrounding undisturbed rangelands.

Still, the researchers wrote, "Our results suggest geomorphic reclamation may improve plant community diversity and wildlife habitat as a practical method for landscape-level restoration in post-mining sites."

The issue has particular relevance for Wyoming, where nearly 90,000 acres have been disturbed by surface mining and many more have been permitted for future mining.

Credit: 
University of Wyoming

Scanning Raman picoscopy: A new methodology for determining molecular chemical structure

image: (a) Schematic of scanning Raman picoscopy (SRP). When a laser beam is focused into the nanocavity between the atomistically sharp tip and substrate, a very strong and highly localized plasmonic field will be generated, dramatically enhancing the Raman scattering signals from the local chemical groups in a single molecule right underneath the tip. (b) Merged SRP image by overlaying four typical Raman imaging patterns shown on the right insets for four different vibrational modes. (c) Artistic view of the Mg-porphine molecule showing how four kinds of chemical groups (colored "Legos") are assembled into a complete molecular structure.

Image: 
©Science China Press

Precise determination of the chemical structure of a molecule is of vital importance to any molecule-related field and is the key to a deep understanding of its chemical, physical, and biological functions. The scanning tunneling microscope and the atomic force microscope have outstanding abilities to image molecular skeletons in real space, but these techniques usually lack the chemical information necessary to accurately determine molecular structures. Raman scattering spectra contain abundant structural information on molecular vibrations. Different molecules and chemical groups exhibit distinct spectral features in Raman spectra, which can be used as the "fingerprints" of molecules and chemical groups. This deficiency can therefore, in principle, be overcome by combining scanning probe microscopy with Raman spectroscopy, as demonstrated by tip-enhanced Raman spectroscopy (TERS), which opens up opportunities to determine the chemical structure of a single molecule.

In 2013, a research group led by Zhenchao Dong and Jianguo Hou at the University of Science and Technology of China (USTC) demonstrated sub-nanometer-resolved single-molecule Raman mapping for the first time [Nature 498, 82 (2013)], driving the spatial resolution with chemical identification capability down to ~5 Å. Since then, researchers around the world have continued to develop this single-molecule Raman imaging technique to explore the ultimate limit of its spatial resolution and how it can best be utilized.

Recently, the USTC group published a research paper in National Science Review (NSR) entitled "Visually Constructing the Chemical Structure of a Single Molecule by Scanning Raman Picoscopy", pushing the spatial resolution to a new limit and proposing an important new application for the state-of-the-art technique. In this work, by developing a cryogenic ultrahigh-vacuum TERS system operating at liquid-helium temperatures and fine-tuning the highly localized plasmonic field at the sharp tip apex, they further drove the spatial resolution down to 1.5 Å, reaching the single-chemical-bond level. This enabled them to achieve full spatial mapping of various intrinsic vibrational modes of a molecule and to discover distinctive interference effects in symmetric and antisymmetric vibrational modes. More importantly, building on the Ångström-level resolution achieved and the newly discovered physical effect, and combining these with a Raman fingerprint database of chemical groups, they propose a new methodology, coined Scanning Raman Picoscopy (SRP), to visually construct the chemical structure of a single molecule. This methodology highlights the remarkable ability of Raman-based scanning technology, via an atomistically sharp tip, to reveal molecular chemical structure in real space simply by "looking" at a single molecule optically, as schematically shown in Figure (a).

By applying the SRP methodology to a single magnesium porphyrin model molecule, the researchers at USTC obtained a set of real-space imaging patterns for different Raman peaks, and found that these patterns show different spatial distributions for different vibrational modes. Taking the typical C-H bond stretching vibration on the pyrrole ring as an example, for the antisymmetric vibration (3072 cm-1) of two C-H bonds, the phase relation of their local polarization responses is opposite. When the tip is located right above the center between two bonds, the contributions from both bonds to the Raman signals cancel out, giving rise to the "eight-spot" feature in the Raman map for the whole molecule, with the best spatial resolution down to 1.5 Å. These "eight spots" have good spatial correspondence with the eight C-H bonds on the four pyrrole rings of a magnesium porphyrin molecule, indicating that the detection sensitivity and spatial resolution have reached the single-chemical-bond level. Raman imaging patterns of other vibrational peaks also show good correspondence to relevant chemical groups in terms of characteristic peak positions and spatial distributions [as shown in Figures (b) and (c)]. The correspondence provided by the simultaneous spatially and energy-resolved Raman imaging allows them to correlate local vibrations with constituent chemical groups and to visually assemble various chemical groups in a "Lego-like" manner into a whole molecule, thus realizing the construction of the chemical structure of a molecule.
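A rough way to see why the antisymmetric stretch produces a node midway between the two bonds is the toy model below: two point responses of opposite phase, each weighted by a tightly confined tip field, cancel exactly when the tip sits between them. This is an assumption-laden sketch, not the authors' calculation; the positions, field width and Gaussian field profile are invented for illustration.

```python
# Toy model (an illustrative sketch, not the authors' simulation): two C-H
# oscillators responding with opposite phase under a tightly confined tip field.
# Their contributions cancel when the tip sits midway between the bonds, giving
# the node described for the antisymmetric stretch.
import numpy as np

bond_positions = np.array([-0.9, 0.9])   # angstroms, hypothetical C-H positions
phases = np.array([1.0, -1.0])           # opposite phase for the antisymmetric mode
field_width = 1.5                         # angstroms, assumed plasmonic confinement

def raman_signal(tip_x):
    """Coherent sum of the two local responses weighted by the tip field."""
    field = np.exp(-((tip_x - bond_positions) ** 2) / (2 * field_width ** 2))
    return abs(np.sum(phases * field)) ** 2

for x in [-0.9, 0.0, 0.9]:
    print(f"tip at {x:+.1f} A -> relative signal {raman_signal(x):.3f}")
```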

Scanning Raman picoscopy (SRP) is the first optical microscopy technique able to visualize the vibrational modes of a molecule and to directly construct the structure of a molecule in real space. The protocol established in this proof-of-principle demonstration can be generalized to other molecular systems, and it can become an even more powerful tool with the aid of image recognition and machine learning techniques. The ability of such Ångström-resolved scanning Raman picoscopy to determine the chemical structure of unknown molecules will undoubtedly arouse extensive interest among researchers in chemistry, physics, materials science, biology and beyond, and is expected to stimulate active research in these fields as SRP is developed into a mature and universal technology.

Credit: 
Science China Press

Mars' water was mineral-rich and salty

image: NASA's Curiosity rover has obtained mineralogical and chemical data from ancient lake deposits at Gale Crater, Mars. The present study reconstructs the water chemistry of the paleolake in Gale Crater based on Curiosity's data.

Image: 
NASA

Presently, Earth is the only known location in the Universe where life exists. This year the Nobel Prize in physics was awarded to three astronomers who proved, almost 20 years ago, that planets are common around stars beyond the solar system. Life comes in various forms, from cell-phone-toting organisms like humans to the ubiquitous micro-organisms that inhabit almost every square inch of the planet Earth, affecting almost everything that happens on it. It will likely be some time before it is possible to measure or detect life beyond the solar system, but the solar system offers a host of sites where we might get a handle on how hard it is for life to start.

Mars is at the top of this list for two reasons. First, it is relatively close to Earth compared to the moons of Saturn and Jupiter (which are also considered good candidates for discovering life beyond Earth in the solar system, and are targeted for exploration in the coming decade). Second, Mars is exceptionally observable because, unlike Venus, it lacks a thick atmosphere, and so far there is pretty good evidence that Mars' surface temperature and pressure hover around the point at which liquid water--considered essential for life--can exist. Further, there is good evidence, in the form of observable river deltas and more recent measurements made on Mars' surface, that liquid water did in fact flow on Mars billions of years ago.

Scientists are becoming increasingly convinced that billions of years ago Mars was habitable. Whether it was in fact inhabited, or is still inhabited, remains hotly debated. To better constrain these questions, scientists are trying to understand the kinds of water chemistry that could have generated the minerals observed on Mars today, which were produced billions of years ago.

Salinity (how much salt was present), pH (a measure of how acidic the water was), and redox state (roughly a measure of the abundance of gases such as hydrogen [H2, which are termed reducing environments] or oxygen [O2, which are termed oxidising environments; the two types are generally mutually incompatible]) are fundamental properties of natural waters. As an example, Earth's modern atmosphere is highly oxygenated (containing large amounts of O2), but one need only dig a few inches into the bottom of a beach or lake today on Earth to find environments which are highly reduced.

Recent remote measurements on Mars suggest its ancient environments may provide clues about Mars' early habitability. Specifically, the properties of pore water within sediments apparently deposited in lakes in Gale Crater on Mars suggest these sediments formed in the presence of liquid water which was of a pH close to that of Earth's modern oceans. Earth's oceans are of course host to myriad forms of life, thus it seems compelling that Mars' early surface environment was a place contemporary Earth life could have lived, but it remains a mystery as to why evidence of life on Mars is so hard to find.

Credit: 
Tokyo Institute of Technology

'Ancient' cellular discovery key to new cancer therapies

image: The findings provide a new opportunity for cancer treatment strategies aimed at suppressing cell proliferation in the nutrient-poor tumour microenvironment, says Flinders Professor Janni Petersen.

Image: 
Photo: Flinders University

Australian researchers have uncovered a metabolic system which could lead to new strategies for therapeutic cancer treatment.

A team at Flinders University led by Professor Janni Petersen, together with the St Vincent's Institute of Medical Research, has found a metabolic link, first in a yeast and now in mammals, that is critical for the regulation of cell growth and proliferation.

"What is fascinating about this yeast is that it became evolutionarily distinct about 350 million years ago, so you could argue the discovery, that we subsequently confirmed occurs in mammals, is at least as ancient as that," said Associate Professor Jonathon Oakhill, Head, Metabolic Signalling Laboratory at SVI in Melbourne.

This project, outlined in a new paper in Nature Metabolism, looked at two major signalling networks.

Often referred to as the body's fuel gauge, a protein called AMP-Kinase, or AMPK, regulates cellular energy, slowing cell growth when cells don't have enough nutrients or energy to divide.

The other, a protein complex called mTORC1/TORC1, also regulates cell growth, increasing cell proliferation when it senses high levels of nutrients such as amino acids, insulin or growth factors.

A hallmark of cancer cells is their ability to over-ride these sensing systems and maintain uncontrolled proliferation.

"We have known for about 15 years that AMPK can 'put the brakes on' mTORC1, preventing cell proliferation" says Associate Professor Oakhill. "However, it was at this point that we discovered a mechanism whereby mTORC1 can reciprocally also inhibit AMPK and keep it in a suppressed state.

Professor Petersen, from the Flinders Centre for Innovation in Cancer in Adelaide, South Australia says the experiments showed the yeast cells "became highly sensitive to nutrient shortages when we disrupted the ability of mTORC1 to inhibit AMPK".

"The cells also divided at a smaller size, indicating disruption of normal cell growth regulation," she says.

"We measured the growth rates of cancerous mammalian cells by starving them of amino acids and energy (by depriving them of glucose) to mimic conditions found in a tumour.

"Surprisingly, we found that these combined stresses actually increased growth rates, which we determined was due to the cells entering a rogue 'survival' mode.

"When in this mode, they feed upon themselves so that even in the absence of appropriate nutrients the cells continue to grow.

"Importantly, this transition to survival mode was lost when we again removed the ability of mTORC1 to inhibit AMPK."

These findings provide a new opportunity for cancer treatment strategies aimed at suppressing cell proliferation in the nutrient-poor tumour microenvironment, the research concludes.

Credit: 
Flinders University

Advancing the application of genomic sequences through 'Kmasker plants'

image: Applications and methods of the bioinformatics tool "Kmasker plants" for the analysis of sequence data.

Image: 
Chris Ulpinnis / IPB Halle & Pixabay

Gatersleben, 20.01.2020 - The development of next-generation sequencing (NGS) has enabled researchers to investigate genomes that might previously have been considered too complex or too expensive. Nevertheless, the analysis of complex plant genomes, which often contain an enormous amount of repetitive sequence, is still a challenge. Therefore, bioinformatics researchers from the Leibniz Institute of Plant Genetics and Crop Plant Research (IPK), Martin Luther University Halle-Wittenberg (MLU) and the Leibniz Institute of Plant Biochemistry (IPB) have now published "Kmasker plants", a program that allows the identification of repetitive sequences and thus facilitates the analysis of plant genomes.

In bioinformatics, the term k-mer is used to describe a nucleotide sequence of a certain length "k". By defining and counting such sequences, researchers can quantify repetitive sequences in the genome they are studying and assign them to corresponding positions. As early as 2014, researchers at IPK in Gatersleben used this approach to develop the in-silico (computer-based) tool "Kmasker". It was used to detect repetitions in the characterisation of the barley genome (Schmutzer et al., 2014).
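As an illustration of the counting idea described above, the snippet below counts k-mers in a sequence and masks positions whose k-mer occurs more often than a chosen threshold. This is a minimal sketch only, not the Kmasker implementation; the k-mer length, threshold and masking character are arbitrary choices made for the example.

```python
# A minimal sketch of the k-mer counting idea (an illustration, not the
# Kmasker implementation): count every k-mer in a sequence so that
# high-count k-mers flag repetitive regions.
from collections import Counter

def count_kmers(sequence, k=21):
    """Return occurrence counts for all k-mers of length k in the sequence."""
    sequence = sequence.upper()
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

def mask_repeats(sequence, k=21, threshold=5):
    """Mark positions whose k-mer occurs more often than the threshold."""
    counts = count_kmers(sequence, k)
    mask = ["X" if counts[sequence[i:i + k].upper()] > threshold else base
            for i, base in enumerate(sequence[:len(sequence) - k + 1])]
    return "".join(mask) + sequence[len(sequence) - k + 1:]

example = "ACGT" * 50 + "TTAGGCATCGGATCCGTAAGC"
print(mask_repeats(example, k=8, threshold=3)[:60])
```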

The use of NGS is becoming more and more important, but the error-free composition of complex genomes from NGS results is still a challenge. For this reason, the researchers recently decided to revive and expand this initial proof-of-concept project. Under the leadership of Dr. Thomas Schmutzer, formerly from the research group "Bioinformatics and Information Technology" at IPK and now affiliated with the MLU, scientists from the MLU, the IPK, Wageningen University & Research and the IPB Halle worked in close cooperation on the redesign and development of "Kmasker plants". This collaboration was largely supported by the two service centres "GCBN" and "CiBi" from the German Network for Bioinformatics Infrastructure "de.NBI".

"Kmasker plants" allows for the rapid and reference-free screening of nucleotide sequences using genome-wide derived k-mers. In extension to the previous version, the bioinformatics tool now also enables comparative studies between different cultivars or closely related species, and supports the identification of sequences suitable as fluorescence in situ hybridisation (FISH) probes or CRISPR/Cas9-specific guide RNAs. Furthermore, "Kmasker plants" has been published with a web service that contains the pre-computed indices for selected economically important crop plants, such as barley or wheat. Dr. Schmutzer emphasises that "this tool will enable plant researchers all over the world to test plant genomes and thus, for example, identify repeat free parts of their sequence of interest." Rather, he believes that the enhanced features will make it possible to detect sequence candidate regions that have multiplied in the genome of one species but are missing in other species or occur in smaller copy numbers. This is a common effect that contributes to phenotypic variation of agronomic importance in various crops. A significant example is the Vrn-H2 gene, which is present in a single copy in winter barley, while it is missing in barley spring lines.

The "Kmasker plants" web-service is now available as part of the IPK Crop Analysis Tool Suite (CATS) and therefore as a service of the de.NBI Service Platform. Alternatively, the "Kmasker plants" source code can directly be accessed and installed via GitHub.

Credit: 
Leibniz Institute of Plant Genetics and Crop Plant Research

Cybercrime: Internet erodes teenage impulse controls

Many teenagers are struggling to control their impulses on the internet, in a scramble for quick thrills and a sense of power online, potentially increasing their risks of becoming cyber criminals.

A new study by Flinders Criminology analysed existing links between legal online activities and cybercrime - for example, how viewing online pornography can progress to opening illegal content, and how motivations evolve from online gaming to hacking.

In an article newly published in the European Journal of Criminology, the authors outline how illegal online activity involving adolescents from 12 to 19 years of age is encouraged by the way the internet blurs normal social boundaries, tempting young users into wrongdoings they wouldn't contemplate in the outside world.

Flinders criminologist Professor Andrew Goldsmith says illegal online activity is especially attractive to adolescents already prone to curiosity and sneaky thrill-seeking, and the internet encourages easily accessible new levels of experimentation.

"The internet allows young people to limit their social involvement exclusively to particular associations or networks, as part of a trend we've termed 'digital drift'. From a regulatory perspective, we're finding this poses significant challenges as it degrades young people's impulse controls."

"It's becoming increasingly important to understand the connection between young people's emotional drivers and committing crimes, as well as human-computer interactions to establish why the internet easily tempts young users into digital piracy, pornography and hacking."

"We're using the word seduction to describe the processes and features intrinsic to the online environment that make online activity both attractive and compelling." "For some young people, the Internet is like a seductive swamp, very appealing to enter, but very sticky and difficult to get out of."

Professor Goldsmith says there needs to be a deeper understanding of the influential technologies regularly used by young people, recognizing that not all motivations for transgression indicate a deep criminal pathology or criminal commitment.

"Policy should consist of interventions that take into account the lack of worldly experience amongst many young offenders. Online technologies render the challenge of weighing up potential risks and harms from actions even harder. A propensity for thrill-seeking common especially among young males encouraged by the Internet can create a form of short-sightedness towards consequences."

"Effective government responses must reflect on the range of motivations young people bring to, and find in, their online behaviours, not least of all in order to garner support amongst young people when it comes to effective regulatory changes."

Credit: 
Flinders University

TB bacteria survive in amoebae found in soil

Scientists from the University of Surrey and University of Geneva have discovered that the bacterium which causes bovine TB can survive and grow in small, single-celled organisms found in soil and dung. It is believed that originally the bacterium evolved to survive in these single-celled organisms known as amoebae and in time progressed to infect and cause TB in larger animals such as cattle.

During the study, published in The ISME Journal, scientists sought to understand more about the bacterium Mycobacterium bovis (M. bovis), which causes bovine TB, and how it can survive in different environments. To do this, scientists infected a type of amoeba known as Dictyostelium discoideum with M. bovis. Unlike other bacteria, which were digested and used as a food source by the amoebae, M. bovis was unharmed and continued to survive for two days. In-depth analysis showed that the bacterium uses the same genes to escape from amoebae that it uses to avoid being killed by immune cells in larger animals such as cattle and humans.

Scientists also discovered that M. bovis remained metabolically active and continued to grow, although at a slower pace, at lower temperatures than expected.

Previously it was thought the bacterium could only replicate at 37°C, the body temperature of cattle and humans; however, replication of the bacterium was identified at 25°C. Researchers believe that the bacterium's ability to adapt to ambient temperatures and survive in amoebae may partially explain high transmission rates of the bacterium between animals.

Bovine TB is a hugely underestimated problem worldwide and England has the highest incidence of infection in Europe. Cattle found to have bovine TB are legally required to be slaughtered due to the high risk of the disease entering the food chain and spreading to humans. 32,793 cattle were slaughtered in England in 2018 in a bid to curtail the spread of the disease.

Lead author Professor Graham Stewart, Head of the Department of Microbial Sciences at the University of Surrey, said: "Despite implementation of control measures, bovine TB continues to be a major threat to cattle and has an enormous impact on the rural economy. Understanding the biology behind the TB disease and how it spreads is crucial for a balanced discussion on this devastating problem and to developing preventative measures to stop its spread.

"An important additional benefit is that our research shows the potential for carrying out at least some future TB research in amoebae rather than in large animals."

Credit: 
University of Surrey