
Discovery of long-sought tiny explosions on the Sun

image: This image shows one of the telescopes of the MWA, which was used to gather the data for this study. The MWA has 128 such telescopes, referred to as tiles, distributed over an area about 5 km in diameter.

Image: 
Pete Wheeler/ICRAR

The Sun is the brightest object in the sky and has been studied for hundreds of years, but it continues to hide some secrets. We all know that the visible surface of the Sun is extremely hot, at a temperature of about 5,500 degrees Celsius. Surprisingly, on top of this sits a layer of gas, called the corona, at a temperature of almost 2 million degrees, over 300 times hotter than the surface of the Sun! What heats the corona to 2 million degrees is one of the most challenging puzzles about the Sun, and no one has found a satisfactory answer to date. One efficient way of extracting this energy from the Sun's magnetic fields involves numerous tiny explosions taking place all over the Sun, all the time. Individually these explosions are too weak, but collectively they have sufficient energy to heat the entire corona due to their sheer numbers. Many attempts have been made to look for the X-rays and ultraviolet light emitted by these explosions, but none has been successful. It was concluded that if they exist, these tiny explosions are too weak to be detected even by the best instruments available today. These explosions are also expected to give rise to tiny flashes of radio light, but until now there were no telescopes sensitive enough to detect them. This work reports the first-ever detection of these flashes.

A group of scientists working at the National Centre for Radio Astrophysics (NCRA), a part of the Tata Institute of Fundamental Research, has recently discovered tiny flashes of radio light coming from all over the Sun. They have identified these as the smoking gun for small magnetic explosions. This is the first-ever evidence of their existence, and it can potentially explain the long-standing coronal heating problem. The work was led by Surajit Mondal, under the supervision of Prof. Divya Oberoi, along with Dr. Atul Mohan, formerly at NCRA and now at the Rosseland Centre for Solar Physics, Norway. In their journey to unravel this mystery, scientists had already figured out that the extra energy heating the corona must come from the solar magnetic fields, but exactly how this happens is still not known.

"What made this breakthrough possible," said Prof. Divya Oberoi, "is the availability of data from a new technology instrument, the Murchison Widefield Array (MWA), and the work which we have been doing for the past few years at NCRA-TIFR to build the techniques and tools to make the most sensitive solar radio images from this data. The very weak radio flashes we have discovered are about 100 times weaker than the weakest bursts reported till now." Surajit Mondal, the lead author of this work, said, "What makes this really exciting is that these flashes are present everywhere on the Sun and at all times, including in the regions of weak magnetic fields, the so-called 'quiet Sun' regions." Dr. Atul Mohan added, "Our preliminary estimates suggest that these tiny magnetic explosions should collectively have enough energy to heat the corona, which is exactly what is needed for solving the coronal heating problem."

Credit: 
Tata Institute of Fundamental Research

Immune cells multiply and diversify in mouse lungs at birth

image: This image shows a macrophage in the developing mouse lung expressing a combination of distinguishing genes, which are highlighted here in red, white and green.

Image: 
Domingo-Gonzalez et al. (CC BY 4.0)

An explosion in the number and types of immune cells in the lungs of newborn mice likely helps them adapt to breathing and protects them from infection, says a new study published today in eLife.

The findings, from Stanford University and Stanford School of Medicine, US, provide detailed information about dramatic shifts in the immune cells in the lungs of mice from just before birth through the first weeks of life. This insight may help scientists learn more about how problems in early development can lead to breathing problems such as asthma later in life.

"At birth, the lung undergoes marked physiological changes as it changes from a fluid-filled, low-oxygen environment to an air-filled, oxygen-rich environment," says co-lead author Racquel Domingo-Gonzalez, who was a postdoctoral researcher at the Department of Pediatrics, Stanford University School of Medicine, when the study was carried out. "How these changes affect immune cell populations during this transition and the ensuing rapid lung growth after birth is unclear."

To learn more, Domingo-Gonzalez and her collaborators used a technique called single-cell transcriptomics to track gene expression in individual immune cells in the lungs of mice just before birth and through the first three weeks of life. This allowed them to create an atlas of all the immune cells in the mouse lung during early life.

The team found that, just before birth, immune cells called macrophages encircle the small blood vessels in the lungs, likely stimulating them to grow. After birth, a large number of many different types of immune cells appear, including those needed for blood-vessel growth, lung development and fighting off infections.

These discoveries may help explain why disruptions to the immune system early in life caused by infections, excessive levels of oxygen, or steroid drugs may lead to life-long lung problems. "Injuries to the immature lung can have profound, life-long consequences since a significant component of lung development occurs during late pregnancy and the first few years of postnatal life," explains co-lead author Fabio Zanini, who was a postdoctoral fellow in Stephen Quake's lab at Stanford University when the study was initiated and has since transitioned to Senior Researcher at UNSW Sydney, Australia.

"Our work lays the foundation for further studies on the diversity of immune cells and their roles during this important window of lung development," adds senior author Cristina Alvira, Associate Professor of Pediatrics at Stanford University School of Medicine. "This could ultimately lead to new therapies to preserve or enhance lung development in infants and young children."

Credit: 
eLife

New evidence on bed bug burden in urban neighborhoods

image: A new study of bed bug infestation in the Chicago area by researchers at UMass Amherst and the University of Illinois reports observing a higher risk of bed bug infestation in poorer, crowded urban areas.

Image: 
Charles Kremenak

AMHERST, Mass. - In the first study to use systematically collected data from multifamily housing inspections to track bed bug infestation, investigators including Christopher Sutherland at the University of Massachusetts Amherst "confirm what has long been suspected for bed bugs, but also for public health issues in general" - infestations are strongly associated with socioeconomic factors, including neighborhood income, eviction rates and crowding.

Writing in People and Nature about their Chicago-area study, biostatistician Sutherland, with biologist Daniel Schneider and urban planner Andrew Greenlee, both of the University of Illinois at Urbana-Champaign, point out that documenting the scale of the bed bug's "dramatic resurgence" as a common household pest and identifying socioeconomic factors that determine infestation risk are challenging, because data usually come from self-reporting, which has potential for bias.

But "unlike previous research, our data come from systematic inspections with known sampling effort and are, therefore, uniquely able to attribute observed reductions to declines in bed bug prevalence rather than trends in reporting," they add.

Sutherland and colleagues say the evidence of higher risk of bed bug infestation in poorer neighborhoods, in areas where evictions are more common and in more crowded neighborhoods "provides important empirical evidence of the disproportionate allocation of public health burdens upon neighborhoods already facing multiple dimensions of disadvantage - for example, poverty, contaminated water and health inequalities."

Sutherland says he was surprised that the patterns were borne out so strongly. "It's discouraging that we still have these extreme polarities in society," he notes. "Differences in socioeconomic factors mean that these public health burdens fall on groups that are less able to cope with them than their more affluent neighbors. We shine a light on yet another public health concern that points squarely to who is bearing the burdens."

Schneider, an expert in dispersal ecology - how species move to new habitats and die out - adds, "The map of where people are most at risk for bed bugs looks like the same areas where more kids have asthma, lead in the bloodstream and likely even COVID-19. How cynical we were coming into this determined how surprised we were by the findings."

The authors' analysis uses administrative data on inspections from Chicago's Department of Buildings. From 2006 to 2018, 21,340 addresses - multiple-dwelling residential buildings four stories or higher, and mixed residential/commercial buildings three stories or higher - saw a total of 56,384 periodic inspections. Of these, 491 resulted in definitive bed bug evidence - a code violation - at the property. These bed bug-positive inspections occurred at 446 unique properties, indicating that some had bed bugs present across multiple inspections, they note.

Using this and other data, the researchers aggregated the number of inspections and violations in each year at the census tract level and derived socioeconomic measures of each tract. From this, they identified four broad socioeconomic categories - residential stability, housing affordability, resident demographics and neighborhood housing characteristics - and nine variables associated with them.

Their analyses showed that, "in addition to significant variation among years, neighborhood-level median household income was the strongest predictor of bed bug prevalence. Eviction rate and crowding had significant, but relatively smaller effects. We did not find evidence that bed bug prevalence was influenced by mobility rate, percent of renter households, or the percent population with a graduate degree."
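As a rough sketch of the pipeline described above (not the authors' actual code or data; the tract names, incomes, and inspection outcomes below are invented for illustration), one can aggregate inspection records by census tract and relate violation rates to a socioeconomic measure such as median income:

```python
# Illustrative sketch: aggregate building inspections to the census-tract
# level, compute a bed bug violation rate per tract, and correlate it
# with tract-level median household income. All data are made up.
from collections import defaultdict
from math import sqrt

# (census_tract, inspection_found_bed_bug_violation)
inspections = [
    ("A", True), ("A", False), ("A", False), ("A", False),
    ("B", False), ("B", False), ("B", False), ("B", False),
    ("C", True), ("C", True), ("C", False), ("C", False),
]

# Hypothetical tract-level median household income (USD)
median_income = {"A": 45_000, "B": 80_000, "C": 30_000}

# Aggregate: inspections and violations per tract
counts = defaultdict(lambda: [0, 0])  # tract -> [inspections, violations]
for tract, violation in inspections:
    counts[tract][0] += 1
    counts[tract][1] += int(violation)

def pearson(xs, ys):
    """Plain Pearson correlation, to avoid any library dependency."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sqrt(sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys))
    return num / den

tracts = sorted(counts)
prevalence = [counts[t][1] / counts[t][0] for t in tracts]
incomes = [median_income[t] for t in tracts]

# In these toy data, poorer tracts show higher violation rates,
# so the income-prevalence correlation comes out strongly negative.
print(dict(zip(tracts, prevalence)), round(pearson(incomes, prevalence), 2))
```

The study itself fits a far richer model with year effects and nine socioeconomic variables; this sketch only shows the aggregation step and the direction of the headline income association.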

Schneider says, "This is just one facet of a larger problem. This is not just a bed bug problem, and if you stack public health issues on top of each other we believe these will correlate strongly." The work appears in an open-access journal, Sutherland says, "so anyone can access the data. We tried hard to make the language clear enough for policymakers, to show that this is more evidence of serious public health disparity."

This study grew out of a two-year, interdisciplinary workshop the authors organized for the National Science Foundation's National Socio-Environmental Synthesis Center (SESYNC) to study bed bug history, sociology, ecology, entomology, urban planning and epidemiology. The research combined existing environment and social data, melding "ideas that existed but were not synthesized together before," in Schneider's words.

Credit: 
University of Massachusetts Amherst

Insurers should be willing to negotiate coronavirus claims to avoid courts being overwhelmed, study says

Insurers should be open to negotiating coronavirus claims to avoid courts becoming overwhelmed with disputes, a new study warns.

Attempts by insurers to avoid paying out to those affected by the pandemic may lead to cases going straight into the law courts, when it would be better for both sides to try to negotiate extra-judicially, using alternative dispute resolution methods such as mediation and arbitration, according to the research.

Dr Kyriaki Noussia, from the University of Exeter Law School, is analysing how "force majeure" clauses - often found in contracts and allowing the non-performance of the contractual obligations due to an extraordinary event - are being used in the insurance industry.

Her study, published in the Journal of International Banking Law & Regulation, warns that many insurers will refuse to pay coronavirus claims by invoking force majeure, meaning cases will end up in court. Many policyholders will find that their business interruption insurance policies do not cover the impact of coronavirus, due to certain exclusions in the risks covered or in the way that force majeure is interpreted by insurers.

Dr Noussia said: "Clearly many contracts can't continue as usual, and people are looking for compensation, but many insurance companies don't want to assert the existence of force majeure for business interruption claims due to coronavirus.

"People will have to closely examine the wording of their policies to see how damage is defined in order to demonstrate they should be compensated by their insurer. Some policies will have exclusions and others will not, so people should read their policy carefully before submitting any claim and seek advice.

"It is likely the courts will be flooded by insurance claims, leading to a backlog in cases being heard. The right approach would be for the insurer not to just immediately reject claims, but to try to negotiate a sum to be awarded as compensation if possible. Rather than saying no and passing the problem to the courts it is better if insurers and claimants work together to find a solution.

"This may include negotiating a premium for new contracts, readjusting existing premiums for contracts close to their renewal or using alternative dispute resolution methods to satisfy a claim."

The study also warns those looking to renew their insurance will have to pay more for some policies, particularly if they want to be covered against the impact of coronavirus in the future. Insurance companies may refuse to provide coverage against coronavirus.

It is likely that the impact of coronavirus will also affect commercial contracts, with one side wanting to claim damages and the other wanting to trigger default force majeure clauses, either to temporarily suspend performance obligations and protect themselves against failures to perform what is stipulated in the contract, or to maintain that they are excused from performance altogether.

Force majeure will also be invoked by people wanting to amend or cancel travel plans, and those about to travel this year may need to amend the scope of their travel insurance cover, or be prepared to travel without such coverage, as coronavirus is no longer a "fortuity" but a "known event" and hence not covered.

Some insurance contracts have detailed wording about what is covered by "force majeure" - normally acts of God such as earthquakes and volcanic eruptions, floods or cyclones, war, strikes and abnormally bad weather, as well as some government actions. The difficulty in current business interruption claims is proving the causal link between the government closure measures and the occurrence of business interruption.

To demonstrate the existence of a claim for business interruption, there has to be direct physical loss or physical damage to the property, and the cause of the business interruption damages being sought has to be that direct physical loss or damage affecting the business's operation and turnover. To calculate the losses incurred in a business interruption claim, companies will need to demonstrate previous turnovers, as well as budgets and revenue forecasts for 2020 and subsequent years. They may also have to show additional costs related to business interruption and consequential losses, such as bringing in additional temporary workers or third-party contractors, claims preparation costs, contractual penalties, or public relations costs. Insurers may argue that at times of disaster very few customers or clients would have patronized the business anyway.
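The loss arithmetic described above can be illustrated with a toy calculation; every figure below is hypothetical and is not drawn from the study:

```python
# Hypothetical business interruption loss calculation, following the
# elements listed above: lost revenue versus a pre-incident forecast,
# plus documented additional costs. All numbers are invented.
forecast_revenue = 120_000   # projected turnover for the closure period
actual_revenue = 25_000      # turnover actually achieved
extra_costs = {
    "temporary_workers": 4_000,
    "claims_preparation": 1_500,
    "contractual_penalties": 2_000,
}

lost_revenue = forecast_revenue - actual_revenue
total_claim = lost_revenue + sum(extra_costs.values())
print(lost_revenue, total_claim)  # 95000 102500
```

In practice each line item would need documentary support (prior-year accounts, budgets, invoices), and an insurer might contest the forecast itself, as the article notes.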

Policyholders often can't negotiate the coverage of business interruption policies, as such policies are by nature standard-form policies - known as adhesion policies - which leave no room for negotiation. The study recommends that where wording is ambiguous, decisions should be made in favour of policyholders and against insurers. To avoid decisions constantly favouring policyholders, insurers should use the prior three years of the policyholder's historical financial revenue and cost data to value losses.

US courts have, so far, predominantly seemed reluctant to rule in favor of the existence and provision of insurance coverage for Covid-19 related claims; however, a different approach was taken by a French court in late May 2020, which decided that an insurer had to pay a restaurant owner two months' worth of coronavirus-related revenue losses. Although the insurer said it would appeal, the ruling will certainly be watched closely.

Credit: 
University of Exeter

Study casts doubt on usefulness of Ofsted ratings

The usefulness of Ofsted ratings as guides for parents and students in choosing a secondary school has been called into question by the findings of a new study.

The study, led by the University of York, suggests that Ofsted ratings of secondary school quality account for less than one percent of the differences in students' educational achievement at age 16.

For example, if one student attending a school rated "good" achieves an A at GCSE and another student from a school that "requires improvement" gets a B, the study reveals that only one tenth of the difference in their grades can be attributed to the school rating.

The researchers also found that Ofsted ratings had almost no bearing on student wellbeing or enjoyment of school life, with students attending schools with the worst Ofsted ratings reporting similar levels of happiness, bullying, future aspirations, satisfaction with school, and ambition as those students attending schools with the highest Ofsted ratings.

Lead author of the study, Professor Sophie von Stumm, from the Department of Education at the University of York, said: "We have found that the factors parents care about most when selecting a school - their child's educational achievement and wellbeing - are negligibly predicted by Ofsted ratings.

"If Ofsted ratings don't predict students' achievement and wellbeing, we need to reconsider just how helpful they are in general. Ofsted inspections are extremely stressful for teachers, causing problems for recruitment and retention in the profession, and they are also very costly to the taxpayer, with the bill coming in at around £7,000 per school visit on average.

"Parents often go to great lengths to secure a place at an 'outstanding' school for their children - either by moving house or commuting long distances. Our research suggests these investments don't really achieve what they are aimed at - good grades and well-being for children. So parents should ask themselves: is an outstanding school really worth spending an hour commuting each day rather than using the time to play or read?"

The study looked at data from just under 4,400 pupils in England. The data included information on family background, academic grades at age 11 and 16 and the results of questionnaires investigating levels of wellbeing and school engagement.

Roughly in line with national averages, 27% of the young people in the study attended an "Outstanding" school, 47% attended a "Good" school, 22% attended a "Requires Improvement" school, and 4% attended a school rated as "Inadequate".

While the study results initially indicated that Ofsted's overall quality rating of schools accounted for four percent of the differences in educational achievement at age 16, most of this association could be attributed to family socioeconomic status and prior achievement in primary school.

Once the researchers isolated the unique effects of school quality on student outcomes, Ofsted ratings of school quality predicted less than one percent of the differences in GCSE examination grades.

Professor von Stumm added: "This finding suggests that even the small benefits of school quality for students' individual outcomes can be largely attributed to schools' selection of student intake, not to their influence on academic progress or 'added value'.

"Due to high demand for places, schools rated 'outstanding' can be more selective about the pupils they enrol. They are also often situated in more affluent neighbourhoods with families of higher socioeconomic status. We know from previous research that children's early years school performance and family background are two of the strongest predictors of their later educational achievement."

"School quality ratings are weak predictors of students' achievement and wellbeing" is published in the Journal of Child Psychology and Psychiatry (JCPP).

Credit: 
University of York

Understanding the role of cardiorespiratory fitness and body composition in brain health

image: Ryan Larsen is interested in understanding how fitness interventions can influence brain health.

Image: 
Della Perrone for the Beckman Institute for Advanced Science and Technology

A new study led by researchers at the Beckman Institute for Advanced Science and Technology examined how cardiorespiratory fitness and body composition relate to neuronal health in 290 healthy young adults.

The study, “Body mass and cardiorespiratory fitness are associated with altered brain metabolism,” was published in Metabolic Brain Disease.

The study contributes to a growing body of research suggesting that fitness has beneficial effects on brain health. The study applied magnetic resonance spectroscopy to detect and measure brain metabolites, focusing specifically on N-acetyl aspartic acid (NAA).

“NAA is produced in the neurons and is an important biochemical marker of energy production and neuronal health,” said Aron Barbey, a University of Illinois psychology professor, who led the research with senior research scientist Ryan Larsen. “Our prior work demonstrates that neuronal health, as measured by NAA, has favorable associations with cognitive performance. We were interested in exploring whether modifiable lifestyle factors, such as physical activity and aerobic fitness, are also linked to NAA.”

The researchers showed that a lower percentage of body fat is associated with higher NAA in the white matter, and that this relationship largely accounts for the association between NAA and cardiorespiratory fitness.

“Our findings suggest that fitter adults benefit from improved structural brain connectivity,” Larsen said. “A central question raised by this work is whether we can modify NAA through physical activity and fitness interventions, providing an effective method to enhance cognitive performance and brain health across the lifespan.”

The research team also included U of I psychology professor Neal Cohen; Northeastern University psychology professors Charles Hillman and Arthur Kramer; and Northeastern University postdoctoral fellow Lauren Raine.

The Office of the Director of National Intelligence, Intelligence Advanced Research Projects Activity (IARPA), supported this research.


The study “Body mass and cardiorespiratory fitness are associated with altered brain metabolism” can be found at doi:10.1007/s11011-020-00560-z.


Credit: 
Beckman Institute for Advanced Science and Technology

App promises to improve pain management in dementia patients

University of Alberta computing scientists are developing an app to help health-care staff assess and manage pain in patients suffering from dementia and other neurodegenerative diseases.

"The challenge with understanding pain in patients with dementia is that the expressions of pain in these individuals are often mistaken for psychiatric problems," said Eleni Stroulia, professor in the Department of Computing Science and co-lead on the project. "So we asked, how can we use technology to better understand the pain of people with dementia?"

Along with Stroulia, the project is led by Thomas Hadjistavropoulos at the University of Regina as part of AGE-WELL, one of Canada's Networks of Centres of Excellence.

The app will serve to digitize a pen-and-paper observational checklist that past research has shown helps health-care workers such as nurses when assessing pain in their patients suffering from dementia.

"Our work is to develop an application for nurses to use as well as a back-end repository that stores and manages this data safely," explained Stroulia, who co-leads an AGE-WELL research theme on Technology for Maintaining Good Mental and Cognitive Health. "This new research demonstrates the promising results from our initial trial."

The researchers are now working to build an app that can be adopted more widely. On a micro scale, the app will allow health-care workers to see how pain and pain management techniques are working, or not working, on an individual level, informing patient-care decision making. On a macro scale, widespread use of the tool may have the capacity to improve the quality and efficacy of care that patients with dementia receive.

"When we have this kind of data, we can build models to understand the impact of different interventions," said Stroulia. "This is what can change policy and care in the long-term--evidence-based policy that changes the state of how we practice medicine."

Credit: 
University of Alberta

Citizen scientists spot closest young brown dwarf disk yet

Brown dwarfs are the middle child of astronomy, too big to be a planet yet not big enough to be a star. Like their stellar siblings, these objects form from the gravitational collapse of gas and dust. But rather than condensing into a star's fiery hot nuclear core, brown dwarfs find a more zen-like equilibrium, somehow reaching a stable, milder state compared to fusion-powered stars.

Brown dwarfs are considered to be the missing link between the most massive gas giant planets and the smallest stars, and because they glow relatively dimly they have been difficult to spot in the night sky. Like stars, some brown dwarfs can retain the disk of swirling gas and dust left over from their initial formation. This material can collide and accumulate to form planets, though it's unclear exactly what kind of planets brown dwarfs can generate.

Now researchers at MIT, the University of Oklahoma, and elsewhere, with the help of citizen scientists, have identified the closest young brown dwarf with the kind of disk that could potentially form planets. The brown dwarf, named W1200-7845, is a mere 3.7 million years old and sits at a nearby 102 parsecs, or about 332 light years from Earth.

At this proximity, scientists may be able to zoom in on the young system with future high-powered telescopes, to examine the earliest conditions of a brown dwarf's disk and perhaps learn more about the kind of planets brown dwarfs might support.

The new system was discovered through Disk Detective, a crowdsourced project funded by NASA and hosted by Zooniverse that provides images of objects in space for the public to classify, with the aim of picking out objects that are likely stars with disks that could potentially host planets.

The researchers are presenting their findings, as well as announcing a new version of the Disk Detective website, this week at the all-virtual meeting of the American Astronomical Society.

"Within our solar neighborhood"

Users of Diskdetective.org, which first launched in 2014, can look through "flipbooks" -- images of the same object in space, taken by NASA's Wide-field Infrared Survey Explorer, or WISE, which detects infrared emissions such as thermal radiation given off by the gas and dust debris in stellar disks. A user could classify an object based on certain criteria, such as whether the object appears oval -- a shape that more resembles a galaxy -- or round -- a sign that the object is more likely a disk-hosting star.

"We have multiple citizen scientists look at each object and give their own independent opinion, and trust the wisdom of the crowd to decide what things are probably galaxies and what things are probably stars with disks around them," says study co-author Steven Silverberg, a postdoc in MIT's Kavli Institute for Astrophysics and Space Research.
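The majority-vote scheme Silverberg describes can be sketched in a few lines; the object names and votes below are invented for illustration, not taken from Disk Detective:

```python
# Hypothetical sketch of crowd-vote aggregation: several volunteers
# classify each flipbook object, and the majority label (with its
# agreement fraction) decides which objects get expert follow-up.
from collections import Counter

classifications = {
    "flipbook-1": ["disk", "disk", "galaxy", "disk", "disk"],
    "flipbook-2": ["galaxy", "galaxy", "disk", "galaxy", "galaxy"],
}

def majority_label(votes):
    """Return the most common label and the fraction of voters who chose it."""
    label, count = Counter(votes).most_common(1)[0]
    return label, count / len(votes)

for obj, votes in classifications.items():
    print(obj, majority_label(votes))
```

The agreement fraction is the useful extra signal here: objects with high agreement on "disk" are the ones a science team would prioritize for telescope follow-up.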

From there, a science team including Silverberg follows up on crowd-classified disks, using more sophisticated methods and telescopes to determine if indeed they are disks, and what characteristics the disks may have.

In the case of the newly discovered W1200-7845, citizen scientists first classified the object as a disk in 2016. The science team, including Silverberg and Maria Schutte, a graduate student at the University of Oklahoma, then looked more closely at the source with an infrared instrument on the Magellan 6.5-meter telescopes at Las Campanas Observatory in Chile.

With these new observations, they determined that the source was indeed a disk around a brown dwarf that lived within a "moving group" -- a cluster of stars that tend to move as one across the night sky. In astronomy, it's far easier to determine the age of a group of objects rather than one alone. Because the brown dwarf was part of a moving group of about 30 stars, previous researchers were able to estimate an average age for the group, about 3.7 million years old, that was likely also the age of the brown dwarf.

The brown dwarf is also very close to the Earth, at about 102 parsecs away, making it the closest young brown dwarf detected yet. For comparison, our nearest stellar neighbor, the Alpha Centauri system, is about 1.3 parsecs from Earth.
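For readers who want to check the distances quoted above, the conversion uses the standard factor of about 3.26 light years per parsec; a minimal sketch:

```python
# Sanity check on the quoted distances: 1 parsec ~ 3.2616 light years
# (standard astronomical conversion factor).
LY_PER_PARSEC = 3.2616

def parsecs_to_light_years(pc):
    return pc * LY_PER_PARSEC

# W1200-7845 sits at about 102 parsecs, i.e. roughly 333 light years.
print(parsecs_to_light_years(102))
```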

"When it's this close, we consider it to be within the solar neighborhood," Schutte says. "That proximity is really important, because brown dwarfs are lower in mass and inherently less bright than other objects like stars. So the closer these objects are to us, the more detail we'll be able to see."

Looking for Peter Pan

The team plans to zoom further in on W1200-7845 with other telescopes, such as ALMA, the Atacama Large Millimeter Array in Chile, comprising 66 huge radio dishes that work together as one powerful telescope to observe the universe between the radio and infrared bands. At this range and precision, the researchers hope to see the brown dwarf's disk itself, to measure its mass and radius.

"A disk's mass just tells you how much stuff is in the disk, which would tell us if planet formation happens around these systems, and what sorts of planets you'd be able to produce," Silverberg says. "You could also use that data to determine what kinds of gas are in the system which would tell you about the disk's composition."

In the meantime, the researchers are launching a new version of Disk Detective. In April 2019, the website went on hiatus, as its hosting platform, the popular citizen scientist portal Zooniverse, briefly retired its previous software platform in favor of an updated version. The updated platform has prompted Silverberg and his colleagues to revamp Disk Detective. The new version, launching this week, will include images from a full-sky survey, PanSTARRS, that observes most of the sky in high-resolution optical bands.

"We're getting more current images with different telescopes with better spatial resolution this time around," says Silverberg, who will be managing the new site at MIT.

Where the site's previous version was aimed at finding any disks around stars and other objects, the new site is designed to pick out "Peter Pan" disks -- disks of gas and dust that should be old enough to have formed planets, but for some reason haven't quite yet.

"We call them Peter Pan disks because they seem to never grow up," Silverberg says.

The team identified its first Peter Pan disk with Disk Detective in 2016. Since then, seven others have been found, each at least 20 million years old. With the new site, they hope to identify and study more of these disks, which could help to nail down conditions under which planets, and possibly life, may form.

"The disks we find will be excellent places to look for exoplanets," Silverberg says.

"If planets take longer to form than we previously thought, the star they orbit will have fewer gigantic flares when the planets finally form. If the planet receives fewer flares than it would around a younger star, that could significantly impact our expectations for discovering life there."

Credit: 
Massachusetts Institute of Technology

Smart devices should space out vibrations to maximize user alert benefits

We are constantly surrounded by sounds and vibrations in our environment, such as a ringing phone or a buzzing smart device like a wearable activity tracker. While such notifications from personal devices are an efficient way of alerting users to an incoming call or email, do they also distract users from what they are currently doing?

This was what a team of researchers from Yale-NUS College sought to find out.

The team, led by Yale-NUS Assistant Professor of Social Sciences (Psychology) Christopher Asplund and Singapore University of Technology and Design's Assistant Professor Simon Perrault, found that haptic feedback (such as vibration feedback) does cause distraction, but this loss of focus lasts only for about one second. The findings can help designers improve the usability of notification features in devices.

Information conveyed through haptic feedback has advantages as it can alert users privately (as compared to a ringing phone) and during physical activities. Moreover, there has been increased interest in further developing haptic interfaces in devices in recent years. While distraction from visual and auditory feedback has been extensively studied, Asst Prof Asplund explained that the distraction caused by haptic feedback remains poorly understood. This latest study provides new information on the attentional capture effects in haptic feedback and offers suggestions for designing alerts in smart devices. The study was published in May in ACM Transactions on Computer-Human Interaction.

Asst Prof Asplund said, "Distracting sounds and vibrations in the environment capture users' attention, and we wanted to understand its impact on doing other things. So if you are surprised by an unexpected vibration from your activity monitor, will you fail to notice your buzzing phone? The answer appears to be yes, but the timing matters: The distraction effects are strong but last for only about a second. That's why we think that devices could be designed to compensate for our distractibility, either by separating the sending of critical information in time or by detecting distracting events and then delaying the presentation of information to the user."

Hence, the team recommends that smart devices should have dynamically scheduled notifications where multiple alerts are separated by at least one second. In addition, devices can be designed to actively sense unexpected vibrations or sounds in the environment and consequently delay notifications until the optimal time gap is reached to minimise distractions to the user. For example, a smartwatch could delay non-urgent notifications such as emails when the user is running.
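The one-second spacing rule the team recommends can be sketched as a simple scheduler that pushes each alert back until at least a second has passed since the previous one. This is an illustrative outline only, not the authors' implementation; the class and constant names are hypothetical.

```python
MIN_GAP_S = 1.0  # minimum spacing between alerts, per the study's recommendation


class AlertScheduler:
    """Toy sketch: ensure consecutive haptic alerts are at least MIN_GAP_S apart."""

    def __init__(self):
        # No alert has fired yet, so the first one may fire immediately.
        self.last_alert = float("-inf")

    def schedule(self, now):
        """Return the earliest time (seconds) at which the next alert may fire."""
        fire_at = max(now, self.last_alert + MIN_GAP_S)
        self.last_alert = fire_at
        return fire_at
```

A real device would also need the "sense and delay" behaviour described above, deferring the alert further when an unrelated vibration was just detected.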

Credit: 
Yale-NUS College

New CRISPR advance may solve key quandary

image: A photo of lead author Kelly Banas, PhD candidate.

Image: 
ChristianaCare Gene Editing Institute

A mutation unique to certain cancer tumors is a potential homing beacon for safely deploying CRISPR gene editing enzymes to disarm DNA that makes cancer cells resistant to treatment, while ignoring the gene in normal cells where it's critical to healthy function, according to a new study from ChristianaCare's Gene Editing Institute in the journal Molecular Cancer Research.

"This advance addresses a big challenge with using CRISPR in cancer patients, which is ensuring it can distinguish between a tumor cell and a normal cell," said Eric Kmiec, Ph.D., director of ChristianaCare's Gene Editing Institute and principal author of the study.

According to a commentary from journal editors accompanying the study, the process developed by the Gene Editing Institute can "provide an empirical basis for the use of CRISPR-directed gene therapy in solid tumor cells, and continue to advance the use of this technology closer to clinical implementation." Journal editors praised the study for "reporting on the molecular kinetics of CRISPR activity in lung cancer cells for the first time."

Kmiec said the primary focus of the study was to successfully use CRISPR to knock out a gene called NRF2 that protects squamous cell carcinoma lung cancer tumors from being affected by chemotherapy or radiation - but without affecting normal cells. In normal cells, NRF2 can help protect them from various types of damage.

Kmiec said the Gene Editing Institute has done multiple tests in animals to establish that disabling NRF2 with CRISPR increases sensitivity to chemotherapy. They are now conducting tests in animals to further confirm selective targeting of NRF2 in squamous cell tumors and to assess any safety concerns in order to lay the groundwork for a clinical trial in patients. The trial would test whether using CRISPR to knock out the NRF2 gene in squamous cell carcinoma lung cancer tumors improves the efficacy of conventional chemotherapy and radiation treatments. The study notes that the presence of the NRF2 gene in tumors confers a "dismal prognosis" because it protects tumors from being shrunk or destroyed by these therapies.

But Kmiec said there are several other cancers, including esophageal, head and neck, and certain forms of uterine and bladder cancer, that have similar features. They produce tumors that are frequently protected by the NRF2 gene. And like squamous cell tumors, they also have mutations that create what is technically known as a PAM site (short for protospacer adjacent motif) that can serve as a target for keeping CRISPR edits focused exclusively on tumors.

Kmiec and lead author Kelly Banas said the NRF2 gene typically shows up early in tumor development and can be detected by existing diagnostic tests. They said moving quickly with CRISPR to disable NRF2 could improve the efficacy of conventional treatments and potentially lower the dosages required to shrink tumors.

"In a way, we are trying to use the most advanced tool in medical science to enhance the efficacy of some of the mainstays of conventional cancer treatment," Banas said.

Banas said the inspiration for the study came during a conference of lung cancer specialists that included a discussion of genetic sequences that are unique to squamous cell tumors. She said she then set out to explore whether one of these mutations could serve as a "recognition site" for a CRISPR enzyme.

"I was basically looking for something unique to the NRF2 gene in tumor cells that could essentially tell CRISPR 'here is the site where I am supposed to bind and do my work," she said. "Without any targeted therapy available for this type of lung cancer, the ability to use CRISPR to safely disarm a key mechanism that allows tumors to grow even when being hit with chemotherapy could be an important advance."

CRISPR stands for "clustered regularly interspaced short palindromic repeats." It is a defense mechanism found in bacteria that can recognize and slice up the DNA of invading viruses. Scientists have learned how to modify this mechanism so it can be directed to "edit" specific sequences of DNA code. In patient applications, the goal is to use CRISPR to repair defective genes that can cause disease or eliminate or knock out sequences that are causing problems. But challenges arise when the sequences in question are present in both healthy and diseased cells.
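The idea of a PAM serving as a recognition site can be illustrated with a short scan over a DNA string. This is a toy sketch assuming the common Cas9 "NGG" motif (where N matches any base); real guide design involves many additional constraints, and the function name is hypothetical.

```python
def find_pam_sites(seq, pam="NGG"):
    """Return 0-based positions in seq where a Cas9-style PAM occurs.

    'N' in the motif matches any nucleotide; other letters must match exactly.
    Illustrative only -- actual CRISPR targeting also evaluates the 20-nt
    protospacer adjacent to the PAM, strand, and off-target similarity.
    """
    sites = []
    for i in range(len(seq) - len(pam) + 1):
        window = seq[i:i + len(pam)]
        if all(p == "N" or p == b for p, b in zip(pam, window)):
            sites.append(i)
    return sites
```

In the study's setting, the key point is that the tumor-specific mutation creates a PAM that normal cells lack, so the enzyme can only "dock" on tumor DNA.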

Credit: 
Burness

In anti-piracy work, blocking websites more effective when multiple sites are targeted

An important challenge facing media industries today is whether and how copyright policy should be adapted to the realities of the digital age. The invention and subsequent adoption of filesharing technologies has eroded the strength of copyright law across many countries, and research has shown that digital piracy reduces sales of music and motion picture content. A new study that examined the effectiveness of anti-piracy efforts in the United Kingdom found that blocking websites can be effective but only when multiple channels are blocked. The website blocking policies in the U.K. caused a decrease in overall piracy and a 7 to 12% increase in the use of legal subscription sites.

The study, by researchers at Carnegie Mellon University and Chapman University, appears in MIS Quarterly.

"Government regulators in the U.K., copyright holders, and Internet platforms are pursuing a variety of efforts to respond to piracy," explains Michael D. Smith, professor of information technology and marketing at Carnegie Mellon University's Heinz College, who coauthored the study. "We found that to be effective, policies must block multiple channels of access to pirated content."

Researchers sought to determine what factors drive the success or failure of various anti-piracy enforcement actions, such as piracy website blocking, in the context of three court-ordered events in the United Kingdom in the mid-2010s. Specifically, they looked at Internet Service Providers' blocking of a single dominant site in 2012, blocking of 19 piracy sites in 2013, and blocking of 53 video piracy sites in 2014. In each case, no pirated content was removed from the Internet, but the enforcement actions attempted to block U.K. users from reaching pirated content through particular domains.

The study focused on supply-side enforcement efforts, which target the sites and networks that make pirated content available to consumers. This is distinct from demand-side enforcement efforts, which target consumers and have yielded mixed results.

The study found that blocking a single site in 2012 caused no increase in the use of legal sites, but instead caused users to increase visits to other unblocked piracy sites and virtual private network sites. However, blocking 19 piracy sites in 2013 and 53 sites in 2014 caused a decrease in piracy and an increase in use of legal subscription sites of 7 to 12%, as well as an increase in new paid subscriptions.

The researchers conclude that the number of channels disrupted--and thus the strength of the intervention--affected the effectiveness of this type of anti-piracy enforcement. They suggest this was likely due to the increased search and learning costs associated with piracy--that is, the costs incurred when pirate sites are blocked and people have to invest time and effort to find and learn how to use new sites, or choose to use legal sites that cost money.

The study's authors note several limitations to their work: First, they studied legal consumption of media only through paid legal subscription sites; users may consume media legally in other ways. Second, the results may underestimate the effect of website blocking on legal consumption. And third, because the study looked at consumer activity for only three months after each wave of blocked channels, it could not determine how long the impacts lasted.

"Our results show that blocking access to popular pirate sites reduced overall piracy and increased consumers' use of paid legal channels," says Rahul Telang, professor of information systems and management at Carnegie Mellon University's Heinz College, who coauthored the study.

Credit: 
Carnegie Mellon University

COVID-19, fake science, and conspiracy theories

image: The Journal features rapid publication of emerging sequence information, reports on clinical trials of emerging HIV therapies, and images in HIV research.

Image: 
Mary Ann Liebert, Inc., publishers

New Rochelle, NY, May 22, 2020--What past scientific fraud is at the heart of some current anti-vaccine and anti-COVID-19 conspiracy theories? Read the details in the Commentary "Fake Science: XMRV, COVID-19, and the Toxic Legacy of Dr. Judy Mikovits," published in the peer-reviewed journal AIDS Research and Human Retroviruses.

The COVID-19 conspiracy theory put forth by Mikovits is demonstrably untrue, and much of it derives from a scientific fraud she and coworkers perpetrated in 2009.

"There is no legitimate debate to be had on these issues, and any credence given to these dangerous conspiracies will lead to even greater suffering resulting from COVID-19. Steer well clear of Plandemic and the claims of Judy Mikovits," say authors Stuart Neil, King's College, London, and Edward Campbell, Stritch School of Medicine, Loyola University Chicago.

Thomas Hope, PhD, Editor-in-Chief of AIDS Research and Human Retroviruses and Professor of Cell and Molecular Biology at Northwestern University, Feinberg School of Medicine, Chicago, IL states: "With the great desire of the public to understand what they can about the pandemic, combined with misguided and malignant disinformation as produced by Ms. Mikovits, it is critical that all scientists do everything they can to educate the public and disclose the fake or false information."

Credit: 
Mary Ann Liebert, Inc./Genetic Engineering News

Economic Development Quarterly announces a special issue on business incentives

KALAMAZOO, Mich.-- Local and state policymakers push economic development incentives to spur job creation and economic wealth. The outstanding question is, "Do these various types of financial incentives--tax credits, abatements, grants, and others--work?" The selected research papers in the May issue of Economic Development Quarterly (EDQ) focus on estimating the effect of local and state financial incentives in shaping business location decisions.

In his introduction to the special issue, EDQ coeditor and Upjohn Institute Senior Economist Tim Bartik summarizes what we knew before and what we now know due to the findings and policy implications of the research. Bartik also offers suggestions on reforming business incentive policy and discusses where additional research is specifically needed.

Bartik is well known for his work on economic development and financial business incentives, including research developing a unique comprehensive database on economic development incentive programs for most of the United States. His Panel Database on Incentives and Taxes (PDIT) and the Upjohn Institute's WholeData Establishment and Employment database were made available to researchers for this special issue.

"Incentives research needs to get more specific: What incentives work best for what industries in what type of local economy?" writes Bartik. "Researchers have produced evidence that simply expanding incentives does not work well enough to justify the added incentive costs, and that many state and local economies would be better off cutting back on overall incentives. But policy makers legitimately want to know what DOES work to create jobs, if our current incentive practices do not work well."

Credit: 
SAGE

Genetic risk scores may improve clinical identification of patients with heart attack risk

Genetic variants have been linked with a higher risk of having a heart attack, permitting the calculation of "polygenic risk scores" (PRS) that quantify patients' inherited susceptibility based on the number of variants they have.
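The weighted-count idea behind a polygenic risk score can be shown in a few lines: each variant contributes its risk-allele count (0, 1, or 2 copies) multiplied by an effect-size weight, and the contributions are summed. This is a minimal sketch with made-up numbers, not the scoring method used in the study; real PRS use effect sizes estimated from genome-wide association studies across millions of variants.

```python
def polygenic_risk_score(allele_counts, weights):
    """Toy PRS: weighted sum of risk-allele counts per variant.

    allele_counts -- number of risk alleles carried at each variant (0, 1, or 2)
    weights -- per-variant effect sizes (illustrative values only)
    """
    return sum(count * weight for count, weight in zip(allele_counts, weights))
```

Scores computed this way are then compared across a population, e.g. flagging individuals whose score falls in the top 20%, as in the study described below.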

A team led by investigators at Massachusetts General Hospital and the Broad Institute of Massachusetts Institute of Technology and Harvard recently found that applying PRS can identify at-risk patients who are not presently identified through standard clinical evaluations.

In a study reported in the Journal of the American College of Cardiology, the team applied PRS to 47,108 individuals who were an average of 60 years old and were receiving care across three U.S. health care systems (in Massachusetts, Pennsylvania and New York). The PRS strongly associated with the presence of coronary artery disease, which is the cause of heart attacks. Specifically, those with scores in the top 20% were 1.9 times more likely to have developed disease compared with the remaining 80% of the population. Importantly, those with high PRS were not more likely than others to have been previously recognized as high risk by their physicians.

"We identified a subset of individuals at double the risk of heart attack on the basis of their genes. Despite this elevated risk, these individuals were neither more likely to be flagged as high risk, nor more likely to receive preventive statin therapy per our conventional clinical practices--a consistent finding across all three health systems studied," said lead author Krishna Aragam, MD, a cardiologist at Mass General and an instructor in Medicine at Harvard Medical School.

The researchers determined that if coronary artery disease PRS were considered alongside current national guidelines, an additional 4.1% of patients who have not yet experienced a heart attack may be recommended to receive cholesterol-lowering statins. "When coupled with clinical assessments, we estimate that genetic testing may uniquely identify a need for preventive statin therapy in approximately 1 in every 25 of such patients," said Dr. Aragam.

The authors note that assessments of polygenic risk are becoming more pervasive through research-based and direct-to-consumer services. They suggest that calculating patients' genetic risk for heart attacks may help to improve upon current clinical assessments of risk, particularly in situations of clinical uncertainty. "Within our present frameworks for heart attack prevention, we speculate that genetic testing may be most immediately useful to guide clinical management for patients otherwise falling in a 'gray area' of intermediate risk based on standard clinical factors," Dr. Aragam said.

Credit: 
Massachusetts General Hospital

Pulmonary embolism and COVID-19

image: Pallavi Bhargava, M.D., an infectious diseases physician and study co-author.

Image: 
Henry Ford Health System

DETROIT - Researchers at Henry Ford Health System in Detroit say early diagnosis of a life-threatening blood clot in the lungs led to swifter treatment intervention in COVID-19 patients.

In a new study published recently in the journal Radiology, researchers found that 51 percent of patients found to have a pulmonary embolism, or PE, were diagnosed in the Emergency Department, the entry point for patients being admitted to the hospital.

In Europe, research has shown that most cases of PE were diagnosed in patients admitted to the intensive care unit after being on a ventilator for several days.

In the Henry Ford study, researchers say 72 percent of PE diagnoses were in patients who did not require "ICU-level care," suggesting that timely diagnosis and use of blood thinners could have played a role in the treatment process.

"Based on our study, early detection of PE could further enhance and optimize treatment for patients first presenting in the Emergency Department," says Pallavi Bhargava, M.D., an infectious diseases physician involved in the study. "We advise clinicians to think of PE as an additional complication early on during the admission of patients whose symptoms and lab results point to that condition."

Thomas Song, M.D., a radiologist and the study's senior author, says a timely pulmonary CT angiography made the difference in the PE diagnosis. "We recommend CT angiography because a traditional CT scan may not pick up the blood clot," Dr. Song says.

In addition to the early detection finding, other key highlights emerged from the retrospective study of 328 COVID-19 patients who underwent a pulmonary CT angiography between March 16 and April 18 at Henry Ford's acute care hospitals:

22 percent of patients were found to have a pulmonary embolism.

Patients with a BMI (body mass index) of 30 or higher were nearly three times more likely to develop a pulmonary embolism. The ideal BMI for adults is 18.5 to 24.9.

Patients on statin therapy prior to admission were less likely to develop a pulmonary embolism.

Increased D-dimer and C-reactive protein lab markers, in conjunction with a rising oxygen requirement, may be a predictor of a pulmonary embolism, even when patients are receiving preventive blood thinners.
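The BMI figure cited in the findings above is the standard index of weight in kilograms divided by the square of height in meters. A minimal sketch, with a hypothetical flag based on the study's BMI-of-30-or-higher finding (the function names are illustrative, not part of the study):

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2


def pe_risk_category(bmi_value):
    """Hypothetical flag reflecting the study's finding that BMI >= 30
    was associated with roughly triple the pulmonary embolism risk."""
    return "elevated PE risk (obese range)" if bmi_value >= 30 else "lower risk"
```

For example, a 90 kg patient who is 1.7 m tall has a BMI of about 31.1, which falls in the obese range the study associated with elevated risk.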

"Our findings suggest that patients who test positive for COVID-19 should be started on preventive blood thinners early on in their treatment and that the need for CT angiography be assessed on a case by case basis to look for blood clots," Dr. Bhargava says. "Our ER doctors played a key role in meticulously assessing these patients, evaluating their d-dimer marker value and ordering the right CT scans to identify these blood clots so early in the diagnosis."

Credit: 
Henry Ford Health