Culture

More federal funding needed to increase Americans' active transportation habits

image: A new study led by School of Public and International Affairs faculty member Ralph Buehler used National Household Travel Surveys from 2001 and 2017 to estimate frequency, duration, and distance of walking and cycling per capita. The research suggests the locations that have benefited most from federal spending to increase active transportation are large urban areas like Washington, D.C. (above).

Image: 
Photo by Ryan Young for Virginia Tech.

The federal government has allocated only about 2 percent of its transportation funds to encourage walking and cycling, not nearly enough to make a significant difference, according to Ralph Buehler, associate professor and chair of the urban affairs and planning program in the Virginia Tech School of Public and International Affairs.

In a recently released study of National Household Travel Surveys from 2001 and 2017 designed to estimate the frequency, duration, and distance of walking and cycling per capita, Buehler and his collaborators, John Pucher from Rutgers University and Adrian Bauman from the University of Sydney, found only a slight increase in walking and no increase in cycling.

"It is not at all surprising that the small increase in walking was in dense, urban cities like Boston, New York City, Philadelphia, and Washington, D.C., where measures have been taken to improve infrastructure and programs and policies have been adopted to make walking and cycling easier and safer," said Buehler.

The study showed that the most significant increase in walking was among well-educated adults aged 25 to 64.

While walking rates are roughly the same for men and women, cycling remains highest among white, higher-income males. In fact, men are three times as likely as women to cycle, Buehler said.

"In general, women will only cycle if they think the entire ride will be safe," said Buehler. "If they perceive that there will be any danger at all along the way they will resist."

Another obstacle to increasing walking and cycling in the U.S., said Buehler, is that driving is still very attractive and, aside from urban areas, parking is typically free or inexpensive. The study found that when compared to individuals in households without cars, those in households with at least one car were less than half as likely to cycle in 2001 and only slightly more than a third as likely in 2017. Rates were even lower for households with two cars and lowest for those with three or more cars.
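The car-ownership comparison above is a simple relative-likelihood (ratio) calculation. A minimal sketch of that arithmetic, using invented per-capita cycling rates (the actual survey estimates appear in the published study):

```python
# Hypothetical illustration of the relative-likelihood comparison described
# above: cycling rates by household car ownership, with the no-car group as
# the baseline. The rates below are invented for illustration only.

def relative_likelihood(rate, baseline_rate):
    """Ratio of a group's cycling rate to the baseline (no-car) group's rate."""
    return rate / baseline_rate

# Invented per-capita cycling rates (share of people who cycled on a given day).
rates_2017 = {"no car": 0.028, "1 car": 0.010, "2 cars": 0.007, "3+ cars": 0.005}

baseline = rates_2017["no car"]
for group, rate in rates_2017.items():
    print(f"{group}: {relative_likelihood(rate, baseline):.2f}x the no-car rate")
```

With these made-up numbers, the one-car group cycles at roughly 0.36 times the no-car rate, matching the "slightly more than a third as likely" pattern the study reports for 2017.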

The study was recently published online in the March issue of the Journal of Transport and Health.

"Public health researchers know that walking and cycling have many health benefits, as well as being environmentally, socially, and economically sustainable means of urban travel," said Buehler, who noted that one big advantage of these physical activities is that they can easily be integrated into daily routines.

Walking and cycling are natural, cheaper, and more utilitarian than exercise that requires more structured visits to gyms, fitness centers, or swimming pools, he said.

"Because they have health implications, it is important that there be equity when improving conditions and facilities connected to them. Governments need to work on a much greater, nationwide scale and especially with underserved communities to plan and implement safer traffic lights, crosswalks and bike lanes, and bike parking facilities," said Buehler.

"Too little has been done outside of wealthier areas to support more walking or cycling," Buehler said, citing Ward 7 and Ward 8 in Washington, D.C., as an example.

Buehler and his team found that both walking and cycling decreased most among one demographic: 5- to 15-year-olds.

"When you think about how walking and cycling can favorably impact health and how obesity and related problems are on the rise among U.S. adolescents, this is a troubling finding," Buehler said.

While some of the decline in this age group can be attributed to parents' fear of "stranger danger" and the fact that technology allows kids to connect to each other with no need for face-to-face encounters, Buehler said that schools could play an important role in encouraging cycling by teaching a course much like their driver education programs.

"If both parents and children better understand the health benefits cycling offers and learn the 'rules' of cycling, they may be more willing to accept it as a practical, healthier, and even more enjoyable mode of transportation," Buehler said.

Another important target group, said Buehler, is seniors. "Nonexistent, unconnected, or low-quality sidewalks increase the risk of falls by pedestrians, with particularly serious health consequences for seniors," he said. "But given their greater vulnerability when driving in traffic, it is quite likely that they can be encouraged to walk if ensured of safe, convenient, and pleasant walking facilities."

Credit: 
Virginia Tech

Memory boost with just one look

MALIBU, Calif. January 10, 2019-- HRL Laboratories, LLC, researchers have published results showing that targeted transcranial electrical stimulation during slow-wave sleep can improve metamemories of specific episodes by nearly 20% after only one viewing of the episode, compared to controls. Metamemory describes a person's sensitivity to whether their memories are recalled accurately, as in eyewitness testimony.

Unique patterns of transcranial electrical stimulation can be cued during the sleep phase called slow-wave sleep to boost consolidation of new memories into the brain's permanent long-term memory. Known as spatiotemporal amplitude-modulated patterns, or STAMPs, these stimulation patterns can be targeted to affect particular memories. In immersive virtual reality experiments, one-minute episodes were paired with arbitrary STAMPs once during viewing. With subsequent stimulation during sleep, targeted memories were measurably improved after just one viewing. Before this study, the general belief was that targeting individual naturalistic memories would require invasive interventions at the single-neuron scale in the hippocampus.

"Our results suggest that, unlike relatively localized brain circuits responsible for regulating mood and movement, episodic memories are processed by a much more widespread network of brain areas," said HRL principal investigator and lead author Praveen Pilly. "We believe our study will pave the way for next-generation transcranial brain-machine interfaces that can boost learning and memory in healthy humans for real-world tasks, such as language attainment or piloting skills. Such a non-invasive approach can also potentially benefit a majority of patients with learning and memory deficits at much lower cost and risk than required for implanting intracranial electrode arrays. It could also be possible to enhance the efficacy of exposure behavioral therapy with immersive virtual reality using STAMP-based tagging and cueing for the treatment of PTSD."

Credit: 
HRL Laboratories

Who's liable? The AV or the human driver?

image: Hierarchical Game Structure, illustrating the three-layer hierarchical strategic interactions between the law maker, the AV manufacturer, AVs, and HVs on roads. Each player has distinct or even conflicting objectives and aims to select a strategy that optimizes them.

Image: 
Sharon Di and Xu Chen/Columbia Engineering

New York, NY--January 13, 2020--A recent decision by the National Transportation Safety Board (NTSB) on the March 2018 Uber crash that killed a pedestrian in Arizona split the blame among Uber, the company's autonomous vehicle (AV), the safety driver in the vehicle, the victim, and the state of Arizona. With the advent of self-driving cars, the NTSB findings raise a number of questions about the uncertainty in today's legal liability system. In an accident involving an AV and a human driver, who is liable? If both are liable, how should the accident loss be apportioned between them?

AVs remove people from the hands-on task of driving and thus pose a complex challenge to today's accident tort law, which primarily punishes humans. Legal experts anticipate that, through the driving algorithms they program, self-driving car manufacturers--including car designers, sensor vendors, software developers, car producers, and related parties who contribute to design, manufacturing, and testing--will have a direct influence on traffic. While these algorithms make manufacturers indispensable actors, with product liability potentially playing a critical role, policy makers have not yet devised a quantitative method to apportion the loss between the self-driving car and the human driver.

To tackle this problem, researchers at Columbia Engineering and Columbia Law School have developed a joint fault-based liability rule that can be used to regulate both self-driving car manufacturers and human drivers. They propose a game-theoretic model that describes the strategic interactions among the law maker, the self-driving car manufacturer, the self-driving car, and human drivers, and examine how, as the market penetration of AVs increases, the liability rule should evolve.

Their findings are outlined in a new study to be presented on January 14 by Sharon Di, assistant professor of civil engineering and engineering mechanics, and Eric Talley, Isidor and Seville Sulzbacher Professor of Law, at the Transportation Research Board's 99th Annual Meeting in Washington, D.C.

While most current studies have focused on designing AVs' driving algorithms in various scenarios to ensure traffic efficiency and safety, they have not explored human drivers' behavioral adaptation to AVs. Di and Talley wondered about the "moral hazard" effect on humans: whether, with exposure to more and more traffic encounters with AVs, people might be less inclined to exercise "due care" and drive in a riskier fashion when faced with AVs on the road.

"Human drivers perceive AVs as intelligent agents with the ability to adapt to more aggressive and potentially dangerous human driving behavior," says Di, who is a member of Columbia's Data Science Institute. "We found that human drivers may take advantage of this technology by driving carelessly and taking more risks, because they know that self-driving cars would be designed to drive more conservatively."

The researchers used game theory to model a world with interacting players who try to select their own actions to optimize their own goals. The players--law makers, AV manufacturers, AVs, and human drivers--have different goals in the transportation ecosystem. Law makers want to regulate traffic with improved efficiency and safety, self-driving car manufacturers are profit-driven, and both self-driving cars and human drivers interact on public roads and seek to select the best driving strategies. To capture the complex interaction among all the players, the researchers applied game theory methods to see which strategy each player settles on, so that others will not take advantage of his or her decisions.

The hierarchical game helped the team to understand the human drivers' moral hazard (how much risk drivers might decide to take on), the AV manufacturer's impact on traffic safety, and the law maker's adaptation to the new transportation ecosystem. They tested the game and its algorithm on a set of numerical examples, offering insights into behavioral evolution of AVs and HVs as the AV penetration rate increases and as cost or environment parameters vary.
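The strategic interaction described above can be illustrated with a toy best-response computation. To be clear, this is not the authors' model: the care levels, accident probabilities, loss, and liability shares below are all invented, and the real study models a richer three-layer hierarchy including the law maker and manufacturer.

```python
# Toy two-player game: a human driver and an AV each pick a care level, and
# a liability rule splits the expected accident loss between them. All
# numbers are hypothetical; this only sketches the best-response idea.

CARE = ["low", "high"]
CARE_COST = {"low": 0.0, "high": 1.0}

# Invented accident probability given (human care, AV care).
P_ACCIDENT = {("low", "low"): 0.20, ("low", "high"): 0.10,
              ("high", "low"): 0.08, ("high", "high"): 0.03}
LOSS = 30.0
HUMAN_SHARE = 0.2  # liability share the rule assigns to the human driver

def human_cost(h, a):
    # care cost plus the human's expected share of the accident loss
    return CARE_COST[h] + HUMAN_SHARE * P_ACCIDENT[(h, a)] * LOSS

def av_cost(h, a):
    return CARE_COST[a] + (1 - HUMAN_SHARE) * P_ACCIDENT[(h, a)] * LOSS

def best_response_equilibrium():
    h, a = "low", "low"
    for _ in range(20):  # iterate best responses until a fixed point
        h_new = min(CARE, key=lambda c: human_cost(c, a))
        a_new = min(CARE, key=lambda c: av_cost(h, c))
        if (h_new, a_new) == (h, a):
            return h, a
        h, a = h_new, a_new
    return h, a

print(best_response_equilibrium())  # → ('low', 'high')
```

With these invented payoffs, the fixed point has the AV choosing high care while the human chooses low care: because the AV bears most of the liability and drives conservatively, the human's best response is to take less care, which is exactly the moral-hazard pattern Di describes.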

The team found that an optimally designed liability policy is critical to help prevent human drivers from developing moral hazard and to assist the AV manufacturer with a tradeoff between traffic safety and production costs. Government subsidies to AV manufacturers for the reduction of production costs would greatly encourage manufacturers to produce AVs that outperform human drivers substantially and improve overall traffic safety and efficiency. Moreover, if AV manufacturers are not regulated in terms of AV technology specifications or are not properly subsidized, AV manufacturers tend to be purely profit-oriented and destructive to the overall traffic system.

"The tragic fatality in Arizona involving a self-driving automobile elicited tremendous attention from the public and policy makers about how to draw the lines of legal liability when AVs interact with human drivers, cyclists, and pedestrians," Talley adds. "The emergence of AVs introduces a particularly thorny type of uncertainty into the status quo, and one that feeds back onto AV manufacturing and design. Legal liability for accidents between automobiles and pedestrians typically involves a complex calculus of comparative fault assessments for each of the aforementioned groups. The introduction of an autonomous vehicle can complicate matters further by adding other parties to the mix, such as the manufacturers of hardware and programmers of software. And insurance coverage distorts matters further by including third party stakeholders. We hope our analytical tools will assist AV policy-makers with their regulatory decisions, and in doing so, will help mitigate uncertainty in the existing regulatory environment around AV technologies."

Di and Talley are now looking at multiple AV manufacturers that target different global markets with different technological specifications, making the development of legal rules even more complex.

"We know that human drivers will take more risks and develop moral hazard if they think their road environment has become safer," Di notes. "It's clear that an optimal liability rule design is crucial to improve social welfare and road safety with advanced transportation technologies."

Credit: 
Columbia University School of Engineering and Applied Science

Resale ticket markets benefit sports teams and fans

New research co-authored by Yanwen Wang, an assistant professor in the UBC Sauder School of Business, reveals that the resale ticket market also appeals to sports fans who normally buy season tickets.

Resale ticket markets -- also known as secondary ticket markets -- allow season ticket holders to recoup costs by selling unneeded tickets, and they create an alternative supply of tickets that reduces the need for fans to commit to a season's pass.

It turns out this isn't just beneficial for fans - it also boosts team revenues.

"Since sports teams earn significant portions of their revenues from season ticket holders, we wanted to find out how secondary markets actually impact their behaviour," said Wang. "Our research reveals that sports fans are more likely to purchase season tickets when there is a secondary market because they know they can sell them easily. This in turn increases a sport team's revenue by at least seven per cent per year."

Wang says this is a conservative estimate as it does not include incremental revenue sources such as parking, concessions or merchandise sales. She adds that given sports teams have high fixed costs, low marginal costs, and perishable inventory like food served at concessions, the increased revenue could have significant implications for a team's profitability.

For this study, Wang and her team examined a major league baseball team in the U.S. They analyzed 1,924 customers who purchased season ticket packages at least once over a six-year period and tracked their behaviour for 481 games. They looked at the ticket type, price paid, ticket usage and ticket resales to find out whether each ticket was used for attendance, listed, resold or forgone.
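The per-ticket bookkeeping described above amounts to classifying each ticket record into one of four outcomes. A minimal sketch with invented field names and records (the study's actual data covered 1,924 customers across 481 games):

```python
# Hypothetical classification of season tickets into the four outcomes the
# study tracked: attended, resold, listed but unsold, or forgone. The field
# names and sample records below are invented for illustration.
from collections import Counter

def classify_ticket(ticket):
    """Assign a ticket one of the four outcomes, in priority order."""
    if ticket["attended"]:
        return "attended"
    if ticket["resold"]:
        return "resold"
    if ticket["listed"]:
        return "listed (unsold)"
    return "forgone"

tickets = [
    {"attended": True,  "listed": False, "resold": False},
    {"attended": False, "listed": True,  "resold": True},
    {"attended": False, "listed": True,  "resold": False},
    {"attended": False, "listed": False, "resold": False},
]

outcomes = Counter(classify_ticket(t) for t in tickets)
print(outcomes)  # tally of outcomes across the sample
```

Tallying outcomes this way per customer and per game is what lets the researchers relate resale activity to subsequent season-ticket renewal behaviour.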

The researchers also discovered that constraints on ticket pricing, such as minimum price floors (which predetermine the lowest price at which a ticket can be resold), have an adverse impact on season ticket sales.

"This is a complex issue because price floors may be motivated by a sports team's desire to protect brand equity," said Wang. "However, the teams must find ways to balance brand maintenance goals against the benefits of providing more value to the team's hardcore fans."

The researchers hope their study will encourage future research, such as how data from the resale market can inform ticket pricing decisions, as well as the effect on non-sports events like concerts.

Credit: 
University of British Columbia

Redox signaling and new strategies for parasitic weed control

image: Inhibition of haustorium development by the redox inhibitor N-ethylmaleimide (F11). A, 10 μM 2,6-dimethoxybenzoquinone (DMBQ) induced Triphysaria seedlings to develop haustoria. B, 50 μM F11 combined with 10 μM DMBQ inhibited haustorium initiation.

Image: 
Yaxin Wang, Daniel Steele, Maylin Murdock, Seigmund Lai, and John Yoder

Parasitic weeds are among the world's most economically damaging agricultural pests. They use an organ called the haustorium to build connections with host plants and draw nutrients from them.

While the majority of research on parasitic plant biology and control of root parasitic weeds has been heavily focused on seed germination, scientists at UC Davis focused on the development of haustorium and published their findings in Phytopathology.

"We targeted chemicals that may affect signaling pathways during haustorium development and performed small-molecule screens in the model parasitic plant Triphysaria to assay potential haustorium inhibitors," said corresponding author John Yoder. "Several novel inhibitors were identified in our article."

Yoder and his colleagues demonstrated that disrupting redox signaling can effectively prevent haustorium initiation and that some inhibitor chemicals do not affect root growth, making them potential control methods against parasitic weeds.

Their study is unique in screening the largest number of chemicals directly on parasitic plants. The researchers also did a thorough search of the chemical and physical properties of the chemicals in the redox library and the DMBQ analog library from the PubChem and EPA databases. This information is available in the supplemental files included with their article, "Small-Molecule Screens Reveal Novel Haustorium Inhibitors in the Root Parasitic Plant Triphysaria versicolor," published in the November issue of Phytopathology.

"Biology is complicated," Yoder said. "Many structural analogs of DMBQ induce the expression of genes related to haustorium development but do not activate haustorium initiation. Redox regulations are not limited to haustorium development but also involved in most processes in plant growth, development, and stress response."

Additional research on specific control of redox reactions during haustorium development is still needed to make use of discoveries in this article as potential control methods against parasitic weeds.

Credit: 
American Phytopathological Society

What we're learning about the reproductive microbiome

Most research has focused on the oral, skin, and gut microbiomes, but bacteria, viruses, and fungi living within our reproductive systems may also affect sperm quality, fertilization, embryo implantation, and other aspects of conception and reproduction. Yet, according to a review published January 14 in the journal Trends in Ecology & Evolution, little is known about the reproductive microbiome.

What we do know is that there are examples of microbes affecting sexual health and fertility across the animal kingdom, and that these impacts seem to have important consequences for reproductive biology and behavior.

In human men, certain species of bacteria are associated with higher- or lower-quality sperm samples, while higher quantities of bacteria are more prevalent in semen samples from infertile than fertile men.

A study of primates showed that vaginal microbiomes are more diverse in species in which females have more than one sexual partner. Similar findings have been reported in deer mice and in common lizards.

Male mallards with more colorful bills produce semen better able to kill bacteria, leading researchers to speculate that female mallards sometimes choose partners with more colorful bills to reduce the risk of STDs, minimize disruption to their own microbiome, and ensure they receive high quality sperm.

Male bedbugs inseminate a female by piercing her abdomen. Recent work indicates that females, which can die from infections caused by microbes on the male copulatory organ, ramp up their immunological defenses ahead of mating.

In black garden ants, the testes of virgin males appear to favor microbial growth while the sperm-storage organs of virgin females strongly inhibit microbial growth.

Male red junglefowl, a wild ancestor of the domestic chicken, produce more proteins with antimicrobial effects in their ejaculate over successive matings--possibly to better protect dwindling numbers of sperm.

"Reproductive microbiomes can have significant effects on the reproductive function and performance of both males and females," says senior author Tommaso Pizzari, a zoologist at the University of Oxford. "These studies also shed light on the role of the reproductive microbiome in sexual selection, mating system, and sexual conflict."

While research has begun to link alterations in the vaginal microbiome to adverse pregnancy outcomes in humans, it's unclear how the male reproductive microbiome affects fertility and reproductive success, says first author Melissah Rowe (@melissah_rowe), an evolutionary ecologist who studies reproductive biology and behavior at the Netherlands Institute of Ecology.

"This is surprising, because research has shown that bacteria can damage sperm form and function, and that damaged sperm can contribute to pregnancy failure," Rowe says.

Many major questions remain. Rowe and Pizzari are intrigued by how some microbes benefit one sex or species while harming another. For example, Lactobacillus--associated with a healthy vaginal microbiome in women and high-quality semen in men--seems to negatively affect sperm-swimming speed in chickens. But the authors say that the combination of sequencing advances, genomic resources, and investigations of host sexual behavior will likely lead to more discoveries soon.

Credit: 
Cell Press

Gut bacteria could guard against Parkinson's, study finds

A common bacterium that boosts digestive health can slow - and even reverse - the build-up of a protein associated with Parkinson's, new research suggests.

Building on previous research linking brain function to gut bacteria, this study in a roundworm model of Parkinson's identified a probiotic - a so-called good bacterium - that prevents the formation of toxic clumps that starve the brain of dopamine, a key chemical that coordinates movement. These new findings could pave the way for future studies that gauge how supplements such as probiotics impact the condition.

In the brains of people with Parkinson's, alpha-synuclein protein misfolds and builds up, forming toxic clumps. These clumps are associated with the death of nerve cells responsible for producing dopamine. The loss of these cells causes the motor symptoms associated with Parkinson's, including freezing, tremors and slowness of movement.

The researchers from the Universities of Edinburgh and Dundee used roundworms altered to produce the human version of alpha-synuclein that forms clumps. They fed these worms with different types of over-the-counter probiotics to see if bacteria in them could affect the formation of toxic clumps.

The scientists found that a probiotic called Bacillus subtilis had a remarkable protective effect against the build-up of this protein and also cleared some of the already formed protein clumps. This improved the movement symptoms in the roundworms. The researchers also found that the bacterium was able to prevent the formation of toxic alpha-synuclein clumps by producing chemicals that change how enzymes in cells process specific fats called sphingolipids.

The study by Goya ME, Xue F, et al., published in the journal Cell Reports, was funded by Parkinson's UK, EMBO, and the European Commission. It is the latest in a number of recent studies that have found a link between brain function and the thousands of different kinds of bacteria living in the digestive system, known as the gut microbiome. Other studies in mice have found that the gut microbiome has an impact on motor symptoms.

Lead researcher, Dr Maria Doitsidou, from the Centre for Discovery Brain Sciences at the University of Edinburgh, said: "The results provide an opportunity to investigate how changing the bacteria that make up our gut microbiome affects Parkinson's. The next steps are to confirm these results in mice, followed by fast-tracked clinical trials since the probiotic we tested is already commercially available."

Dr Beckie Port, Research Manager at Parkinson's UK, said: "Parkinson's is the fastest growing neurological condition in the world. Currently there is no treatment that can slow, reverse or protect someone from its progression but by funding projects like this, we're bringing forward the day when there will be.

"Changes in the microorganisms in the gut are believed to play a role in the initiation of Parkinson's in some cases and are linked to certain symptoms, which is why there is ongoing research into gut health and probiotics.

"The results from this study are exciting as they show a link between bacteria in the gut and the protein at the heart of Parkinson's, alpha synuclein. Studies that identify bacteria that are beneficial in Parkinson's have the potential to not only improve symptoms but could even protect people from developing the condition in the first place."

Credit: 
Parkinson's UK

Opening up DNA to delete disease

image: DNA is obscured by closed chromatin, an innately protective structure made up of proteins that interact with each other and with DNA. New activation-associated proteins (AAPs) can be custom-built to bind near and open up chromatin so that DNA is made more accessible to gene-editing molecules like CRISPR.

Image: 
Illustration by Karmella Haynes

WASHINGTON, D.C., January 14, 2020 -- Protein editorial assistants are clearing the way for cut-and-paste DNA editors, like CRISPR, to access previously inaccessible genes of interest. Opening up these areas of the genetic code is critical to improving CRISPR efficiency and moving toward futuristic, genetic-based assaults on disease.

The DNA-binding editorial assistants were devised by a U.S.-based team of bioengineers, who describe their design in APL Bioengineering, from AIP Publishing.

"The innovation in this paper is having another protein co-delivered with the CRISPR DNA editor, moving chromatin packaging out of the way, so CRISPR has greater access to the DNA," said lead author Karmella Haynes, from Arizona State University and Emory University.

DNA doesn't usually sit inside cells as a freely accessible double helix. It's heavily wrapped up in a protective package called chromatin, which controls which genes are activated or silenced by a cell at any moment in time. Unfortunately, this packaging also prevents scientists from accessing DNA to correct disease-causing mutations.

Haynes describes chromatin blocking as "the elephant in the room" in CRISPR discussions, but it wasn't directly proved until 2016, when Haynes' team conducted some clever experiments capturing the effect. Her team is trying to fix the problem by investigating different methods of chromatin disruption.

They used a well-established artificial system in which chromatin packaging can be turned on or off for one gene -- luciferase -- which codes for an easily detected luminous protein. When examining the chromatin-packed state, the team found that several editorial assistants, transiently expressed DNA-binding activation-associated proteins (AAPs), disrupted the chromatin and enabled CRISPR to successfully edit the luciferase gene.

"The idea is that if CRISPR needs to bind in the middle of a gene but can't bind close enough to edit the mutation, you could send in our chromatin-opening protein to right outside that hard-to-bind region, rearrange the chromatin, and make the DNA across that gene more accessible for CRISPR to edit the gene," explained Haynes, who is keen for others to use her system to improve CRISPR efficiency. She pointed out the AAPs can be adapted to target different genes, simply by switching the DNA-binding regions.

"It'd be interesting to find out whether one type of AAP is more effective at disrupting chromatin at some genes versus others. Or whether combining proteins together might further enhance CRISPR editing," said Haynes. "I envision there being a whole catalog of CRISPR cofactors that can be used to enhance CRISPR activity."

Credit: 
American Institute of Physics

Impaired driving -- even once the high wears off

image: McLean Hospital Study Finds Marijuana Use Impacts Driving Even When Sober

Image: 
McLean Hospital

Study at a Glance

McLean researchers have discovered that recreational marijuana use affects driving ability even when users are not intoxicated

The study, conducted through driving simulation, concluded that chronic, heavy, recreational marijuana use was associated with worse driving performance in non-intoxicated drivers compared to non-using healthy control participants

Cannabis users had more accidents, drove at higher speeds, and drove through more red lights than non-users

Earlier onset of marijuana use (regular use prior to age 16) was associated with poorer driving performance

Findings may be reflective of increased impulsivity in those who initiate substance use during adolescence; further research will explore this association

A study by McLean Hospital's Mary Kathryn Dahlgren, PhD, Staci Gruber, PhD, and their team from McLean's Cognitive and Clinical Neuroimaging Core and the Marijuana Investigations for Neuroscientific Discovery (MIND) program, has found that recreational cannabis use affects driving ability even when users are not intoxicated by marijuana.

Published in the Drug and Alcohol Dependence journal, the study "Recreational Cannabis Use Impairs Driving Performance in the Absence of Acute Intoxication," finds that in addition to chronic, heavy, recreational cannabis use being associated with poorer driving performance in non-intoxicated individuals compared to non-users, the researchers linked earlier onset of marijuana use (under age 16) to worse performance.

Recreational cannabis use has expanded across the United States in the last several decades and so has public concern about the substance's impact on activities that present safety issues.

While several studies have examined the direct effect of cannabis intoxication on driving, no studies until now have examined the effects on driving in heavy marijuana users who are not high.

Senior author Gruber, along with Dahlgren, used a customized driving simulator to assess the potential impact of cannabis use on driving performance. At the time of the study, marijuana users had not used for at least 12 hours and were not intoxicated.

Overall, heavy marijuana users demonstrated poorer driving performance as compared to non-users. For example, in the simulated driving exercise, marijuana users hit more pedestrians, exceeded the speed limit more often, made fewer stops at red lights, and made more center line crossings.

Gruber, who is among the world's foremost experts in the cognitive effects of marijuana, said the idea that differences can be detected in sober cannabis users may be surprising to the public.

"People who use cannabis don't necessarily assume that they may drive differently, even when they're not high," she said. "We're not suggesting that everyone who uses cannabis will demonstrate impaired driving, but it's interesting that in a sample of non-intoxicated participants, there are still differences in those who use cannabis relative to those who don't."

When researchers divided the marijuana users into groups based on when they started using cannabis, they found that significant driving impairment was confined to those who began using marijuana regularly before age 16.

"It didn't surprise us that performance differences on the driving simulator were primarily seen in the early onset group," Dahlgren said. "Research has consistently shown that early substance use, including the use of cannabis, is associated with poorer cognitive performance."

She added, "What was interesting was when we examined impulsivity in our analyses, most of the differences we saw between cannabis users and healthy controls went away, suggesting that impulsivity may play a role in performance differences."

States where marijuana has been legalized have seen growing public concern that more individuals will drive while intoxicated. But since performance issues can occur even in people who aren't high, Gruber said the public needs to rethink the ways it understands impairment.

"There's been a lot of interest in how we can more readily and accurately identify cannabis intoxication at the roadside, but the truth of the matter is that it is critical to assess impairment, regardless of the source or cause," she said. "It's important to be mindful that whether someone is acutely intoxicated, or a heavy recreational cannabis user who's not intoxicated, there may be an impact on driving, but certainly not everyone demonstrates impairment simply as a function of exposure to cannabis. This is especially important to keep in mind given increasing numbers of medical cannabis patients who differ from recreational users with regard to product choice and goal of use."

Credit: 
McLean Hospital

'Cold Neptune' and two temperate super-Earths found orbiting nearby stars

image: Artist's concept of GJ229Ac, the nearest temperate super-Earth to us that is in a system in which the host star has a brown dwarf companion.

Image: 
Illustration is by Robin Dienel, courtesy of the Carnegie Institution for Science.

Washington, DC-- A "cold Neptune" and two potentially habitable worlds are part of a cache of five newly discovered exoplanets and eight exoplanet candidates found orbiting nearby red dwarf stars, which are reported in The Astrophysical Journal Supplement Series by a team led by Carnegie's Fabo Feng and Paul Butler.

The two potentially habitable planets are orbiting GJ180 and GJ229A, which are among the nearest stars to our own Sun, making them prime targets for observations by next-generation space- and land-based telescopes. Both are super-Earths, with masses of at least 7.5 and 7.9 times Earth's and orbital periods of 106 and 122 days, respectively.

The Neptune-mass planet--found orbiting GJ433 at a distance at which surface water is likely to be frozen--is probably the first of its kind that is a realistic candidate for future direct imaging.

"GJ 433 d is the nearest, widest, and coldest Neptune-like planet ever detected," Feng added.

The newfound worlds were discovered using the radial velocity method for finding planets, which takes advantage of the fact that not only does a star's gravity influence the planet orbiting it, but the planet's gravity also affects the star in turn. This creates tiny wobbles in the star's motion that can be detected using advanced instruments. Due to their lower mass, red dwarfs are the primary class of stars around which terrestrial-mass planets can be found using this technique.
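To get a sense of the scale of these wobbles, the standard radial-velocity semi-amplitude formula can be evaluated for a GJ180d-like planet. This is a minimal sketch; the 0.4-solar-mass host star and the circular, edge-on orbit are illustrative assumptions, not values reported in the study.

```python
import math

# Physical constants (SI units)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M_EARTH = 5.972e24   # Earth mass, kg
DAY = 86400.0        # seconds per day

def rv_semi_amplitude(m_star, m_planet, period, e=0.0, sin_i=1.0):
    """Stellar radial-velocity semi-amplitude K (m/s) induced by an orbiting planet."""
    return ((2 * math.pi * G / period) ** (1 / 3)
            * m_planet * sin_i
            / (m_star + m_planet) ** (2 / 3)
            / math.sqrt(1 - e ** 2))

# GJ180d-like case: 7.5 Earth masses, 106-day orbit. The 0.4-solar-mass
# host star and the circular, edge-on geometry are assumed for illustration.
K = rv_semi_amplitude(0.4 * M_SUN, 7.5 * M_EARTH, 106 * DAY)
print(f"K ~ {K:.1f} m/s")  # the star's reflex wobble, in metres per second
```

For these assumed values the wobble comes out to only a couple of metres per second, which illustrates why spectrographs such as HARPS, PFS, and HIRES must resolve stellar motions at the metre-per-second level.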

Cooler and smaller than our Sun, red dwarfs--also called M dwarfs--are the most common stars in the galaxy and the primary class of stars known to host terrestrial planets. What's more, compared to other types of stars, red dwarfs can host planets at the right temperature to have liquid water on their surfaces on much closer orbits than those found in this so-called "habitable zone" around other types of stars.

"Many planets that orbit red dwarfs in the habitable zone are tidally locked, meaning that the period at which they spin around their axes is the same as the period at which they orbit their host star. This is similar to how our Moon is tidally locked to Earth, meaning that we only ever see one side of it from here. As a result, these exoplanets have a very cold permanent night on one side and a very hot permanent day on the other--not good for habitability," explained lead author Feng. "GJ180d is the nearest temperate super-Earth to us that is not tidally locked to its star, which probably boosts its likelihood of being able to host and sustain life."

The other potentially habitable planet, GJ229Ac, is the nearest temperate super-Earth to us located in a system in which the host star has a brown dwarf companion. Sometimes called failed stars, brown dwarfs are not able to sustain hydrogen fusion. The brown dwarf in this system, GJ229B, was one of the first brown dwarfs to be imaged. It is not known whether brown dwarfs can host exoplanets of their own, but this planetary system is a perfect case study for how exoplanets form and evolve in a star-brown dwarf binary system.

"Our discovery adds to the list of planets that can potentially be directly imaged by the next generation of telescopes," Feng said. "Ultimately, we are working toward the goal of being able to determine if planets orbiting nearby stars host life."

"We eventually want to build a map of all of the planets orbiting the nearest stars to our own Solar System, especially those that are potentially habitable," added Carnegie co-author Jeff Crane.

This research effort--which also included Carnegie's Steve Shectman, John Chambers, Sharon Wang, Johanna Teske, Matías Díaz, and Ian Thompson, as well as Steve Vogt of U.C. Santa Cruz, Hugh Jones of the University of Hertfordshire, and Jennifer Burt of NASA's Jet Propulsion Laboratory--culled and reanalyzed data from the European Southern Observatory's Ultraviolet and Visual Echelle Spectrograph survey of 33 nearby red dwarf stars, which operated from 2000 to 2007 and was released in 2009.

"We have been led to this result by antique data," joked Butler.

Once targets were discovered in the UVES archives, the researchers used observations from three planet-hunting instruments to increase the precision of the data. The Carnegie Planet Finder Spectrograph (PFS) at our Las Campanas Observatory in Chile, ESO's High Accuracy Radial velocity Planet Searcher (HARPS) at La Silla Observatory, and the High Resolution Echelle Spectrometer (HIRES) at the Keck Observatory were all crucial to this effort.

"Combining the data from multiple telescopes increases the number of observations and the time baseline, and minimizes instrumental biases," Butler explained.

Credit: 
Carnegie Institution for Science

Solving complex problems at the speed of light

Many of the most challenging optimization problems encountered in various disciplines of science and engineering, from biology and drug discovery [1] to routing and scheduling [2], can be reduced to NP-complete problems. Intuitively speaking, NP-complete problems are "hard to solve" because the number of operations that must be performed in order to find the solution grows exponentially with the problem size. The ubiquity of NP-complete problems has led to the development of dedicated hardware (such as optical annealing and quantum annealing machines like "D-Wave") and special algorithms (heuristic algorithms like simulated annealing).

Recently, there has been growing interest in solving these hard combinatorial problems by designing optical machines. These optical machines consist of a set of optical transformations imparted to an optical signal, so that the optical signal will encode the solution to the problem after some amount of computation. Such machines could benefit from the fundamental advantages of optical hardware integrated into silicon photonics, such as low loss, parallel processing, optical passivity at low optical powers, and robust scalability enabled by the development of industrial fabrication processes. However, the development of compact and fast photonic hardware, together with dedicated algorithms that optimally utilize its capabilities, has been lacking.

Today, the path to solving NP-complete problems with integrated photonics is open due to the work of Charles Roques-Carmes, Dr. Yichen Shen, Cristian Zanoci, Mihika Prabhu, Fadi Atieh, Dr. Li Jing, Dr. Tena Dubček, Chenkai Mao, Miles Johnson, Prof. Vladimir Čeperić, Prof. Dirk Englund, Prof. John Joannopoulos, and Prof. Marin Soljačić from MIT and the Institute for Soldier Nanotechnologies, published in Nature Communications [3]. In this work, the MIT team developed an algorithm dedicated to solving the well-known NP-complete Ising problem with photonics hardware.

Originally proposed to model magnetic systems, the Ising model describes a network of spins that can point only up or down. Each spin's energy depends on its interaction with neighboring spins -- in a ferromagnet, for instance, the positive interaction between nearest neighbors will incentivize each spin to align with its closest neighbors. An Ising machine will tend to find the spin configuration that minimizes the total energy of the spin network. This solution can then be translated into the solution of other optimization problems [4].
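As a concrete illustration of the model, the energy of a spin configuration and a simulated-annealing search for a low-energy state can be sketched in a few lines of Python. This is a generic heuristic run on a toy fully connected ferromagnet, not the photonic algorithm developed by the MIT team.

```python
import math
import random

def ising_energy(spins, J):
    """Ising energy E = -sum over pairs i<j of J[i][j] * s_i * s_j."""
    n = len(spins)
    return -sum(J[i][j] * spins[i] * spins[j]
                for i in range(n) for j in range(i + 1, n))

def anneal(J, steps=5000, t_start=5.0, t_end=0.01, seed=0):
    """Simulated annealing: try single-spin flips, accepting uphill moves
    with a probability that shrinks as the temperature is lowered."""
    rng = random.Random(seed)
    n = len(J)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = rng.randrange(n)
        spins[i] *= -1                                     # trial flip
        new_energy = ising_energy(spins, J)
        if new_energy <= energy or rng.random() < math.exp((energy - new_energy) / t):
            energy = new_energy                            # accept the move
        else:
            spins[i] *= -1                                 # reject: undo the flip
    return spins, energy

# Toy ferromagnet: 8 spins, every pair coupled with J = +1, so the minimum-
# energy configuration is all spins aligned, with energy -28 (one -1 per pair).
n = 8
J = [[1 if i != j else 0 for j in range(n)] for i in range(n)]
spins, energy = anneal(J)
print(spins, energy)
```

Single-spin-flip annealing like this is exactly the kind of sequential heuristic that a photonic Ising machine aims to outpace by updating all spins in parallel.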

Heuristic Ising machines, like the one developed by the MIT team, yield only a candidate solution to the problem (which is, on average, close to the optimal solution). However, algorithms that always find the exact solution to the problem are difficult to apply to large problem sizes, as they would often have to run for hours, if not days, to terminate. Therefore, heuristic algorithms are an attractive alternative to exact algorithms, since they provide fast and cheap solutions to hard problems.

The researchers were guided by their knowledge of fundamental photonics. Professor Marin Soljačić from MIT explains: "Optical computing is a very old field of research. Therefore, we had to identify which recent advances in photonic hardware could make a difference. In other words, we had to identify the value proposition of modern photonics." Graduate student Charles Roques-Carmes adds: "We identified this value proposition to be: (a) performing fast and cheap fixed matrix multiplication, and (b) performing noisy computation, which means that the result of the computation slightly varies from one run to the other, a little bit like flipping a coin. Therefore, these two elements are the building blocks of our work."
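A rough sketch of how those two building blocks might combine into a spin solver: each iteration performs one fixed matrix-vector multiplication (the operation photonic hardware does cheaply), perturbs the result with noise (the "coin flip"), and thresholds it back to spin values. This is a hypothetical illustration of the ingredients named above, not a reproduction of the team's published algorithm.

```python
import random

def noisy_iteration_solver(J, steps=2000, noise=0.5, seed=1):
    """Heuristic Ising search built from a fixed matrix product plus noise.

    Each step computes the local fields h = J @ s (the fixed matrix
    multiplication), adds Gaussian noise that is annealed toward zero,
    and synchronously resets every spin to the sign of its noisy field.
    """
    rng = random.Random(seed)
    n = len(J)

    def energy(s):
        return -sum(J[i][j] * s[i] * s[j]
                    for i in range(n) for j in range(i + 1, n))

    spins = [rng.choice([-1, 1]) for _ in range(n)]
    best, best_e = spins[:], energy(spins)
    for step in range(steps):
        sigma = noise * (1 - step / steps)  # anneal the noise amplitude
        fields = [sum(J[i][j] * spins[j] for j in range(n)) for i in range(n)]
        spins = [1 if fields[i] + rng.gauss(0, sigma) >= 0 else -1
                 for i in range(n)]
        e = energy(spins)
        if e < best_e:
            best, best_e = spins[:], e      # keep the best state seen so far
    return best, best_e

# Toy ferromagnet with 6 fully coupled spins: the aligned ground state
# has energy -15 (one -1 for each of the 15 pairs).
n = 6
J = [[1 if i != j else 0 for j in range(n)] for i in range(n)]
best, best_e = noisy_iteration_solver(J)
print(best, best_e)
```

Because every spin is updated at once from a single matrix-vector product, an optical implementation of this kind of loop could in principle perform each iteration as fast as light propagates through the circuit.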

While developing this algorithm and benchmarking it on various problems, the researchers discovered a variety of related algorithms that could also be implemented in photonics to find solutions even faster. Postdoctoral associate Dr. Yichen Shen is enthusiastic about the prospect of this work: "The field of enhancing computing capability with integrated photonics is currently booming, and we believe this work can be part of it. Since the algorithm we developed optimally leverages the strengths and weaknesses of photonic hardware, we hope it could find some short-term application." The MIT research team is currently working in collaboration with others towards realizing proof-of-concept experiments and benchmarking their algorithm on photonic hardware, versus other photonic machines and conventional algorithms running on computers.

Credit: 
Massachusetts Institute of Technology, Institute for Soldier Nanotechnologies

Cancer surgery: It depends on experience

Tumours of the colon, so-called colorectal carcinomas, are the second to third most frequent tumours in women and men in Germany. The surgical removal of the tumours is a central component of the therapy.

"Two aspects are important for long-term survival after surgery: firstly, oncologically correct surgery and secondly, the right treatment if complications arise after surgery," says PD Dr. Armin Wiegering, head of the Visceral Oncology Center at the University Hospital of Würzburg in Bavaria, Germany.

There is a clear correlation between the number of operations performed per year in a hospital and the chance of survival. This was shown by Wiegering's research team in a study whose results are published in BJS Open, a journal of the British Society of Surgery.

Mortality rates in small hospitals are twice as high

The results of the study: In hospitals that perform few operations on colorectal carcinomas (an average of six per year), the post-operative mortality rate is twice as high as in hospitals with large case numbers (an average of 50 per year).

This difference is not due to the fact that complications occur more often in smaller hospitals - because, according to Wiegering, this happens about equally often in all hospitals. Rather, the difference is that patients in small hospitals die more often from the complications. "In large hospitals, on the other hand, there is a sufficient infrastructure to save patients in the event of postoperative complications," said the Würzburg physician.

Facts and figures of the study

In Germany, more than half of all patients with colon cancer are currently operated on in hospitals that do not meet the minimum case numbers (50 per year) required by the German Cancer Society (DKG). With more than 150 cases per year, the University Hospital of Würzburg is one of the hospitals with very high case numbers.

The study included all cases of colorectal carcinoma operated on in hospitals in Germany between 2012 and 2015: a total of 64,349 patients. Across all hospitals, 3.9 percent of the patients died. In small hospitals the rate was 5.3 percent, in large clinics only 2.6 percent.
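As a back-of-the-envelope check of the reported figures (a sketch; the release does not break out patient counts per hospital group):

```python
patients = 64_349
overall_rate = 0.039                   # 3.9 percent across all hospitals
print(round(patients * overall_rate))  # roughly 2,500 deaths in 2012-2015

small_rate, large_rate = 0.053, 0.026  # small vs. large hospitals
print(small_rate / large_rate)         # about 2: mortality twice as high
```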

Studies on further tumour diseases

"This is the first time that we have been able to prove for Germany that there is a clear correlation between the number of patients operated on per year and the success of the operation," said Wiegering. His team was surprised at how big the difference is. "We had not expected that the mortality rate in smaller clinics would be twice as high. It is therefore essential to operate on patients in hospitals whose medical staff has sufficient experience."

Wiegering's team now plans to carry out similar analyses for stomach carcinomas, liver metastases and other tumour diseases.

Credit: 
University of Würzburg

Sugar changes the chemistry of your brain

The idea of food addiction is a very controversial topic among scientists. Researchers from Aarhus University have delved into this topic and examined what happens in the brains of pigs when they drink sugar water. The conclusion is clear: sugar influences brain reward circuitry in ways similar to those observed when addictive drugs are consumed. The results have just been published in the journal Scientific Reports.

Anyone who has desperately searched their kitchen cabinets for a piece of forgotten chocolate knows that the desire for palatable food can be hard to control. But is it really addiction?

"There is no doubt that sugar has several physiological effects, and there are many reasons why it is not healthy. But I had been in doubt about the effects sugar has on our brain and behaviour, and I had hoped to be able to kill a myth," says Michael Winterdahl, Associate Professor at the Department of Clinical Medicine at Aarhus University and one of the main authors of the work.

The publication is based on experiments with seven pigs that received two liters of sugar water daily over a 12-day period. To map the consequences of the sugar intake, the researchers imaged the brains of the pigs at the beginning of the experiment, after the first day, and after the 12th day of sugar intake.

"After just 12 days of sugar intake, we could see major changes in the brain's dopamine and opioid systems. In fact, the opioid system, which is that part of the brain's chemistry that is associated with well-being and pleasure, was already activated after the very first intake," says Winterdahl.

When we experience something meaningful, the brain rewards us with a sense of enjoyment, happiness and well-being. It can happen as a result of natural stimuli, such as sex or socializing, or from learning something new. Both "natural" and "artificial" stimuli, like drugs, activate the brain's reward system, where neurotransmitters like dopamine and opioids are released, Winterdahl explains.

We chase the rush

"If sugar can change the brain's reward system after only twelve days, as we saw in the case of the pigs, you can imagine that natural stimuli such as learning or social interaction are pushed into the background and replaced by sugar and/or other 'artificial' stimuli. We're all looking for the rush from dopamine, and if something gives us a better or bigger kick, then that's what we choose," explains the researcher.

When examining whether a substance like sugar is addictive, one typically studies the effects on the rodent brain. "It would, of course, be ideal if the studies could be done in humans themselves, but humans are hard to control and dopamine levels can be modulated by a number of different factors. They are influenced by what we eat, whether we play games on our phones or if we enter a new romantic relationship in the middle of the trial, with potential for great variation in the data. The pig is a good alternative because its brain is more complex than a rodent's, gyrated like a human's, and large enough for imaging deep brain structures using human brain scanners," says Winterdahl. The current study in minipigs introduced a well-controlled set-up, with the only variable being the absence or presence of sugar in the diet.

Credit: 
Aarhus University

No need to dig too deep to find gold!

image: This is a sampling of volcanic gases in Vulcano Crater (Aeolian Islands).

Image: 
© UNIGE

Why are some porphyry deposits - formed by magmatic fluids in volcanic arcs - rich in copper while others primarily contain gold? In an attempt to answer this question, a researcher from the University of Geneva (UNIGE) investigated how the metals are accumulated over the duration of a mineralizing event, looking for a correlation between the amounts of copper and gold extracted from the deposits. Not only did the researcher discover that the depth of the deposits influences the quantity of metals produced, but also that over 95% of the gold is lost to the atmosphere through volcanic emissions. In short, the deeper a deposit is, the more copper there will be, while gold-rich deposits are closer to the surface. These findings, which are published in the journal Nature Communications, will provide valuable assistance to companies that mine these metals.

Geological processes produce different kinds of deposits. Porphyry-type deposits are formed underneath volcanoes by an accumulation of magma that releases fluids on cooling and precipitates metals in the form of ore. "Precipitation is the extraction of metals from the magmatic fluid and their fixation in an ore", explains Massimo Chiaradia, a researcher in the Department of Earth Sciences in UNIGE's Science Faculty. These porphyry deposits, which are found mainly around the Pacific Ring of Fire, produce three-quarters of the natural copper and a quarter of the natural gold mined. "A copper deposit can contain from one to 150 million tonnes, while the quantity of gold varies from ten tonnes to 2,500 tonnes per deposit," continues Chiaradia. But will a copper-rich deposit automatically be rich in gold? And how can we tell where the largest deposits are located?

The depth of the deposit is crucial

The Geneva-based geologist used a range of statistical models to analyse two hypotheses: either the magmatic fluids have varying amounts of metal from the outset, or the fluids are identical and it is the effectiveness of the precipitation of the metals that influences the quantity of copper and gold. "I quickly saw that the first hypothesis wasn't right, and that the answer lay with precipitation, but with differences between gold and copper related to the duration of mineralisation," explains Chiaradia. "The longer the mineralisation time, the richer the deposit will be in copper. And for the mineralisation to be as long as possible, the deposit must be deep - 3 km from the surface - to guarantee a certain degree of insulation and a long magma life."

Chiaradia observed that less than 1% of the gold is captured in the ores in the deep copper-rich deposits. On the other hand, in deposits located at a depth of up to 3 km, the rate climbs to 5%, "which is still very small, because over 95% of gold always escapes into the atmosphere." In fact, although gold escapes extremely easily in volcanic emissions, "it is retained more in shallow deposits where a separation takes place between the liquid and the vapor, which helps its precipitation," points out Chiaradia. "In the deeper deposits, however, liquid and vapor form only a single fluid phase, which precipitates the copper quickly and makes the gold leak into the atmosphere as the fluid rises to the surface."

Gold is found on the surface, while copper is found at depth

Recent studies have shown that the demand for copper is increasing to such a degree that it will outstrip the availability of natural and recyclable reserves within a few decades. This means that new exploration methods are needed to help find new deposits. And for the first time, these results clearly distinguish two types of porphyry deposits and explain the different ways they are formed. The first, which are very deep, promote the mineralisation of copper over a long period, while the second, which are closer to the surface, produce more gold. "It's a valuable indication for the mineral exploration industry, which now knows at what depth it will find large deposits of copper, or conversely large gold deposits, irrespective of the volcano," concludes Chiaradia.

Credit: 
Université de Genève

X-rays and gravitational waves will combine to illuminate massive black hole collisions

A new study by a group of researchers at the University of Birmingham has found that collisions of supermassive black holes may be simultaneously observable in both gravitational waves and X-rays at the beginning of the next decade.

The European Space Agency (ESA) has recently announced that its two major space observatories of the 2030s will have their launches timed for simultaneous use. These missions, Athena (the next-generation X-ray space telescope) and LISA (the first space-based gravitational-wave observatory), will be coordinated to begin observing within a year of each other and are likely to have at least four years of overlapping science operations.

According to the new study, published this week in Nature Astronomy, ESA's decision will give astronomers an unprecedented opportunity to produce multi-messenger maps of some of the most violent cosmic events in the Universe, which have not been observed so far and which lie at the heart of long-standing mysteries surrounding the evolution of the Universe.

They include the collision of supermassive black holes in the core of galaxies in the distant universe and the "swallowing up" of stellar compact objects such as neutron stars and black holes by massive black holes harboured in the centres of most galaxies.

The gravitational waves measured by LISA will pinpoint the ripples in spacetime that the mergers cause, while the X-rays observed with Athena will reveal the hot and highly energetic physical processes in that environment. Combining these two messengers to observe the same phenomenon in these systems would bring a huge leap in our understanding of how massive black holes and galaxies co-evolve, how massive black holes grow their mass and accrete, and the role of gas around these black holes.

These are some of the big unanswered questions in astrophysics that have puzzled scientists for decades.

Dr Sean McGee, Lecturer in Astrophysics at the University of Birmingham and a member of both the Athena and LISA consortiums, led the study. He said, "The prospect of simultaneous observations of these events is uncharted territory, and could lead to huge advances. This promises to be a revolution in our understanding of supermassive black holes and how they grow within galaxies."

Professor Alberto Vecchio, Director of the Institute for Gravitational Wave Astronomy, University of Birmingham, and a co-author on the study, said: "I have worked on LISA for twenty years and the prospect of combining forces with the most powerful X-ray eyes ever designed to look right at the centre of galaxies promises to make this long haul even more rewarding. It is difficult to predict exactly what we're going to discover: we should just buckle up, because it is going to be quite a ride".

During the life of the missions, there may be as many as 10 mergers of black holes with masses of 100,000 to 10,000,000 times the mass of the sun that have signals strong enough to be observed by both observatories. However, given our current lack of understanding of the physics occurring during these mergers and of how frequently they occur, the observatories could observe many more or many fewer of these events. Indeed, these are questions which will be answered by the observations.

In addition, LISA will detect the early stages of stellar-mass black hole mergers, which will conclude with detection by ground-based gravitational-wave observatories. This early warning will allow Athena to be observing the binary's location at the precise moment the merger occurs.

Credit: 
University of Birmingham