Tech

Common WiFi can detect weapons, bombs and chemicals in bags

image: Using common WiFi, this low-cost suspicious object detection system can detect weapons, bombs and explosive chemicals in bags, backpacks and luggage.

Image: 
Data Analysis and Information Security (DAISY) Lab led by Professor Yingying Chen

Ordinary WiFi can easily detect weapons, bombs and explosive chemicals in bags at museums, stadiums, theme parks, schools and other public venues, according to a Rutgers University-New Brunswick-led study.

The researchers' suspicious object detection system is easy to set up, reduces security screening costs and avoids the invasion of privacy that occurs when screeners open and inspect bags, backpacks and luggage. Traditional screening typically requires high staffing levels and costly specialized equipment.

"This could have a great impact in protecting the public from dangerous objects," said Yingying (Jennifer) Chen, study co-author and a professor in the Department of Electrical and Computer Engineering in Rutgers-New Brunswick's School of Engineering. "There's a growing need for that now."

The peer-reviewed study received a best paper award at the 2018 IEEE Conference on Communications and Network Security, a cybersecurity conference. The study - led by researchers at the Wireless Information Network Laboratory (WINLAB) in the School of Engineering - included engineers at Indiana University-Purdue University Indianapolis (IUPUI) and Binghamton University.

WiFi, or wireless, signals in most public places can penetrate bags to get the dimensions of dangerous metal objects and identify them, including weapons, aluminum cans, laptops and batteries for bombs. WiFi can also be used to estimate the volume of liquids such as water, acid, alcohol and other chemicals for explosives, according to the researchers.

This low-cost system requires a WiFi device with two to three antennas and can be integrated into existing WiFi networks. The system analyzes what happens when wireless signals penetrate and bounce off objects and materials.
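For readers curious how such a pipeline might look in practice, here is a minimal, purely illustrative sketch (not the Rutgers system): wireless channel measurements recorded from a two- or three-antenna receiver are reduced to simple statistical features and fed to an off-the-shelf classifier. The use of channel state information (CSI), the array shapes, the feature choices and the labels are all assumptions made for the example.

```python
# Illustrative sketch only -- not the published Rutgers system. It assumes
# channel state information (CSI) amplitudes are already captured per antenna
# and per subcarrier; all data, feature choices and labels are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def csi_features(csi):
    """csi: array of shape (antennas, subcarriers, packets) of amplitudes."""
    return np.concatenate([
        csi.mean(axis=2).ravel(),   # average attenuation per antenna/subcarrier
        csi.var(axis=2).ravel(),    # variability as the signal bounces off contents
    ])

# Hypothetical training data: one synthetic CSI capture per scanned bag.
rng = np.random.default_rng(0)
captures = [rng.normal(loc=l, scale=1.0, size=(3, 30, 200))
            for l in (0.0, 1.0, 2.0) for _ in range(20)]
labels = ["benign"] * 20 + ["metal"] * 20 + ["liquid"] * 20

X = np.array([csi_features(c) for c in captures])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

# A new bag would be flagged if the classifier predicts a dangerous material class.
print(clf.predict([csi_features(rng.normal(loc=1.0, scale=1.0, size=(3, 30, 200)))]))
```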

Experiments with 15 types of objects and six types of bags demonstrated detection accuracy rates of 99 percent for dangerous objects, 98 percent for metal and 95 percent for liquid. For typical backpacks, the accuracy rate exceeds 95 percent and drops to about 90 percent when objects inside bags are wrapped, Chen said.

"In large public areas, it's hard to set up expensive screening infrastructure like what's in airports," Chen said. "Manpower is always needed to check bags and we wanted to develop a complementary method to try to reduce manpower."

Next steps include trying to boost accuracy in identifying objects by imaging their shapes and estimating liquid volumes, she said.

Credit: 
Rutgers University

Healthy fat cells uncouple obesity from diabetes

image: Corresponding author Dr. Sean Hartig (left) and co-author Natasha Chernis.

Image: 
Baylor College of Medicine

About 422 million people around the world, including more than 30 million Americans, have diabetes. Approximately 90 percent of them have type 2 diabetes. People with this condition cannot effectively use insulin, a hormone made by the pancreas that helps the body turn blood sugar (glucose) into energy.

The inability to use insulin, called insulin resistance, results in increasing levels of blood sugar, which, if not controlled, can significantly raise the risk of major health problems such as blindness, kidney failure, heart attacks, stroke and lower limb amputation. In 2015, the World Health Organization estimated that 1.6 million deaths were directly caused by diabetes. Until recently, this type of diabetes was only seen in adults, but it is now also occurring more frequently in children.

"Obesity is the most significant risk factor for type 2 diabetes and other metabolic conditions, and affects one in three adults worldwide," said Dr. Sean Hartig, assistant professor of medicine and of molecular and cellular biology at Baylor College of Medicine. "Although medical consensus recommends making life style changes toward a healthy diet and increased physical activity to both prevent and help control diabetes, this strategy has shown to be difficult to implement and maintain by most people."

Hartig and his colleagues are exploring alternative ways to control obesity and type 2 diabetes that may involve the use of therapies that would complement the current efforts to educate the public about healthy diets and exercise routines. To achieve this goal, they are studying the cellular and molecular mechanisms involved in fat metabolism using both genetic mouse models and human tissues.

Subcutaneous white fat versus belly fat dictates metabolic health in obesity

Although obesity significantly increases the risk of diabetes, about 30 percent of obese people do not show insulin resistance and do not develop type 2 diabetes or other metabolic conditions, such as fatty liver disease. What leads to obesity while maintaining insulin sensitivity is not well understood; however, scientists know that the condition is associated with the body's ability to expand the storage of subcutaneous white adipose (fat) tissue.

"Subcutaneous white fat represents 80 percent of all fat tissue in mice and people and it is stored in the hips, arms and legs. When energy intake (food) overwhelms the ability to store calories in subcutaneous white fat, fat 'spills over' into organs that are not specialized for storing fat, such as the liver, the pancreas and muscle," said co-author Natasha Chernis, research technician at Baylor College of Medicine. "People who develop diabetes have more abdominal (belly) fat. Our idea is to find ways to expand subcutaneous white fat depots in obesity, so fat is not stored in places like the abdomen or the liver, where it can cause metabolic problems."

Another key player in the obesity and diabetes puzzle is the immune system. Obesity leads to a low-grade inflammatory response that can interfere with the metabolic functions of subcutaneous white fat tissue. This inflammatory microenvironment likely disturbs this fat tissue's ability to respond to insulin, contributing to insulin resistance and type 2 diabetes. This is supported by findings that increased levels of pro-inflammatory cytokines, such as interferon-gamma, correlate with insulin resistance, reduced subcutaneous white fat expansion and accumulation of abdominal fat. This raises the question, however: what is different in obese individuals who do not develop insulin resistance and diabetes?

Another piece of the puzzle, miR-30a

"When we started this project six years ago, our goal was to better understand fat metabolism and identify potential ways to help people lose weight," Hartig said. "We found a microRNA called miR-30a - a small non-coding RNA molecule that regulates gene expression - that could stimulate pathways important for fat metabolism. Originally, we thought that expressing miR-30a would lead to weight loss because it would be driving fat metabolism, but we observed something different. We found miR-30a did not correlate with leanness; instead, it was associated with a form of obesity in which subjects actually maintained insulin sensitivity."

Hartig and his colleagues discovered that reduced miR-30a expression in fat tissue correlated with insulin resistance in both obese mice and obese humans. Interestingly, overexpressing miR-30a in subcutaneous white fat tissue of obese mice significantly improved insulin sensitivity, reduced levels of blood lipids and decreased buildup of fat in the liver without altering body weight. In addition, the researchers found that miR-30a expression reduced inflammation in subcutaneous white fat tissue.

"We have provided evidence that expression of miR-30a protects fat cells by attenuating inflammation derived from mediators such as interferon gamma and leads to improved insulin sensitivity in obese mice," Hartig said.

These findings open the possibility of developing therapeutic entry points for many forms of diabetes, not just diabetes associated with obesity. For instance, targeting components of the immune system locally within adipose tissue may enable subcutaneous white fat to expand appropriately in lipodystrophies - conditions characterized by abnormal distribution of body fat - where diabetes occurs in patients without obesity.

"We are interested in this idea that we can uncouple obesity from co-morbidities such as heart disease and insulin resistance," Hartig said. "It has become clear in the past 10 years that obesity doesn't mean diabetes. We are interested in learning how to manipulate the inflammatory response inside fat tissue of people with insulin resistance or type 2 diabetes so they expand the subcutaneous white fat deposits and become metabolically healthy."

Read all the details of this study in the journal Diabetes.

Credit: 
Baylor College of Medicine

Ayahuasca risk: Potent psychedelic DMT mimics near-death experience in the brain

A powerful psychedelic compound found in ayahuasca can model near-death experiences in the brain, a study has found.

Near-death experiences, or NDEs, are significant psychological events that occur close to actual or perceived impending death. Commonly reported aspects of NDEs include out-of-body experiences, feelings of transitioning to another world and of inner peace, many of which are also reported by users taking DMT.

DMT is a potent psychedelic found in certain plants and animals, and is the major psychoactive compound in ayahuasca, the psychedelic brew prepared from vines and used in ceremonies in South and Central America.

Researchers from Imperial College London set out to look at the similarities between the DMT experience and reports of NDEs. Their findings, published today in the journal Frontiers in Psychology, reveal a large overlap between the experiences reported by people who have had NDEs and those of healthy volunteers given DMT.

As part of the trial, the team gave 13 healthy volunteers intravenous DMT and a placebo over two sessions, with each volunteer receiving one of four doses of the compound. The research was carried out at the NIHR Imperial Clinical Research Facility. All volunteers were screened and overseen by medical staff throughout.

Researchers compared the participants' experiences against a sample of 67 people who had previously reported actual NDEs and who had completed a standardised questionnaire to try and quantify their experiences. The group were asked a total of 16 questions including 'Did scenes from your past come back to you?' and 'Did you see, or feel surrounded by, a brilliant light?'.

Following each dosing session, the 13 healthy volunteers filled out exactly the same questionnaire, so that the experiences they had whilst on DMT could be compared with those of the NDE group.

The team found that all volunteers scored above a given threshold for determining an NDE, showing that DMT could indeed mimic actual near-death experiences, at an intensity comparable to that reported by people who have actually had an NDE.

Dr Robin Carhart-Harris, who leads the Psychedelic Research Group at Imperial and supervised the study, said: "These findings are important as they remind us that NDEs occur because of significant changes in the way the brain is working, not because of something beyond the brain. DMT is a remarkable tool that can enable us to study and thus better understand the psychology and biology of dying."

Professor David Nutt, Edmond J Safra Chair in Neuropsychopharmacology at Imperial, said: "These data suggest that the well-recognised life-changing effects of both DMT and NDE might have the same neuroscientific basis."

PhD candidate Chris Timmermann, a member of the Psychedelic Research Group at Imperial and first author of the study, said: "Our findings show a striking similarity between the types of experiences people are having when they take DMT and people who have reported a near-death experience."

The researchers note some subtle, but important differences between DMT and NDE responses, however. DMT was more likely to be associated with feelings of 'entering an unearthly realm', whereas actual NDEs brought stronger feelings of 'coming to a point of no return'. The team explain that this may be down to context, with volunteers being screened, undergoing psychological preparation beforehand and being monitored throughout in a 'safe' environment.

"Emotions and context are particularly important in near-death experiences and with psychedelic substances," explains Timmermann. "While there may be some overlap between NDE and DMT-induced experiences, the contexts in which they occur are very different."

"DMT is a potent psychedelic and it may be that it is able to alter brain activity in a similar fashion as when NDEs occur.

"We hope to conduct further studies to measure the changes in brain activity that occur when people have taken the compound. This, together with other work, will help us to explore not only the effects on the brain, but whether they might possibly be of medicinal benefit in future."

While the initial findings are interesting, the authors caution against self-medication with ayahuasca.

Credit: 
Imperial College London

NASA's IMERG estimates heavy rainfall over the eastern US

video: During the period from August 6 to early Aug. 13, 2018, IMERG data indicated that the highest rainfall accumulations of greater than 8 inches (203 mm) occurred over Texas. Areas of heavy rainfall accumulations above 4 inches (101.6 mm) were indicated in many other states from the south-central to northeastern United States.

Image: 
Credits: NASA/JAXA, Hal Pierce

Most of the eastern half of the United States had rainfall during the past week. Some parts of the country experienced heavy rainfall that resulted in flash floods and various other problems. NASA added up that rainfall using satellite data and an algorithm called IMERG to provide a look at the amount of rainfall along the eastern U.S.

Slow-moving storm systems and nearly stationary fronts were the cause of heavy rainfall over Virginia this past weekend of Aug. 11 and 12. Several trees were brought down by a severe storm that hit Fredericksburg, Virginia, on Sunday afternoon. Fallen trees blocked several roads, flash flooding occurred and electrical power was lost in that area.

Continuing heavy rain also fell in Texas over the weekend. This provided some badly needed drought relief in that area but heavy rainfall also resulted in over three dozen people being rescued on Sunday from flood waters along the Nueces River in south-central Texas.

The Global Precipitation Measurement mission, or GPM, constellation of satellites provided the data for these rainfall estimates. GPM is a joint mission between NASA and the Japan Aerospace Exploration Agency, JAXA.

At NASA's Goddard Space Flight Center in Greenbelt, Md., these rainfall accumulation estimates were derived from NASA's Integrated Multi-satellitE Retrievals for GPM (IMERG) data. IMERG data were used to calculate estimates of precipitation totals from a combination of space-borne passive microwave sensors, including the GMI microwave sensor on the GPM satellite, and geostationary infrared data.

NASA's Precipitation Measurement Missions (PMM) science team has developed algorithms, such as IMERG, that support the GPM mission. This analysis shows an estimate of IMERG rainfall accumulation totals during the period from August 6 to early August 13, 2018. IMERG data indicated that the highest rainfall accumulations of greater than 8 inches (203 mm) occurred over Texas.

Areas of heavy rainfall accumulations above 4 inches (101.6 mm) were indicated in many other states from the south-central to northeastern United States.
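As a rough illustration of how such accumulation totals are built up (a sketch under stated assumptions, not NASA's actual processing chain), half-hourly precipitation-rate fields of the kind IMERG produces can simply be summed over the period of interest, with each field weighted by its half-hour duration. The grid size and the synthetic rain rates below are invented for the example.

```python
# Illustrative sketch: accumulate half-hourly gridded precipitation rates
# (mm/hr) into a weekly total and convert to inches. The grid dimensions and
# synthetic data are assumptions for the example, not IMERG products.
import numpy as np

half_hours = 7 * 48                      # Aug 6 - Aug 13: one week of half-hourly fields
lat, lon = 50, 50                        # coarse grid, purely for illustration
rng = np.random.default_rng(1)
rates_mm_per_hr = rng.gamma(shape=0.2, scale=5.0, size=(half_hours, lat, lon))

# Each field covers 0.5 hr, so accumulation = sum(rate) * 0.5
total_mm = rates_mm_per_hr.sum(axis=0) * 0.5
total_inches = total_mm / 25.4

print("max accumulation: %.1f mm (%.1f in)" % (total_mm.max(), total_inches.max()))
```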

On Aug. 13, the National Weather Service Weather Prediction Center (WPC), College Park Md. noted that more rainfall is expected along the U.S. East Coast. WPC said, "A quasi-stationary front extending from the Mid-Atlantic southward into the Southeast and the Eastern Gulf Coast will aid in producing showers and thunderstorms over the region through Tuesday evening. In addition, a quasi-stationary upper-level low over the Ohio Valley/Central Appalachians will slowly move eastward to the Northern Mid-Atlantic Coast by Tuesday evening[, Aug. 14]. The energy will aid in producing showers and thunderstorms from parts of the Ohio Valley/Lower Great Lakes to the Northeast through Tuesday evening into Monday. The system will have slight diurnal component with the greatest areal coverage during the late afternoon into the late evening on Monday and Tuesday."

Credit: 
NASA/Goddard Space Flight Center

In cell cultures, e-cigarette vapor boosts inflammation

E-cigarette vapour boosts the production of inflammatory chemicals and disables key protective cells in the lung that keep the air spaces clear of potentially harmful particles, reveals a small experimental study, published online in the journal Thorax.

The vapour impairs the activity of alveolar macrophages, which engulf and remove dust particles, bacteria, and allergens that have evaded the other mechanical defences of the respiratory tract.

The findings prompt the researchers to suggest that while further research is needed to better understand the long-term health impact of vaping on people, e-cigarettes may be more harmful than we think, as some of the effects were similar to those seen in regular smokers and people with chronic lung disease.

Vaping is increasing in popularity, but most of the current body of research has focused on the chemical composition of e-cigarette liquid before it is vaped.

To find out how vaping might change this chemical soup, and what impact this might have, the researchers devised a mechanical procedure to mimic vaping and produce condensate from the vapour.

They extracted alveolar macrophages from lung tissue samples provided by eight non-smokers who had never had asthma or chronic obstructive pulmonary disease (COPD).

A third of the cells were exposed to plain e-cigarette fluid, a third to different strengths of the artificially vaped condensate with and without nicotine, and a third to nothing for 24 hours.

The results showed that the condensate was significantly more harmful to the cells than e-cigarette fluid and that these effects worsened as the 'dose' increased.

After 24 hours of exposure the total number of viable cells exposed to the vaped condensate was significantly reduced compared to the untreated cells, and condensate containing nicotine exaggerated this effect.

Exposure to the condensate increased cell death and boosted production of oxygen free radicals by a factor of 50, and it significantly increased the production of inflammatory chemicals--more so when the condensate contained nicotine.

What's more, the ability of cells exposed to vaped condensate to engulf bacteria was significantly impaired, although treatment with an antioxidant restored this function and helped lessen some of the other harmful effects.

The researchers conclude that the vaping process itself can damage vital immune system cells, at least under laboratory conditions.

"Importantly, exposure of macrophages to [e-cigarette vapour condensate] induced many of the same cellular and functional changes in [alveolar macrophage] function seen in cigarette smokers and patients with COPD," they write.

In an accompanying podcast, lead author Professor David Thickett explains that many e-cigarette companies have been bought up by the tobacco giants, "and there's certainly an agenda to portray e-cigarettes as safe."

While e-cigarettes are safer than traditional cigarettes, they may still be harmful in the long term, he says, as the current body of research is in its infancy and not able to answer that question yet.

"In terms of cancer causing molecules in cigarette smoke, as opposed to cigarette vapour, there are certainly reduced numbers of carcinogens. They are safer in terms of cancer risk, but if you vape for 20 or 30 years and this can cause COPD, then that's something we need to know about," he states.

"I don't believe e-cigarettes are more harmful than ordinary cigarettes," he concludes. "But we should have a cautious scepticism that they are as safe as we are being led to believe."

Credit: 
BMJ Group

New technology can detect hundreds of proteins in a single sample

New technology developed by a team of McGill University scientists shows potential to streamline the analysis of proteins, offering a quick, high volume and cost-effective tool to hospitals and research labs alike.

Proteins found in blood provide scientists and clinicians with key information on our health. These biological markers can determine if a chest pain is caused by a cardiac event or if a patient has cancer.

Unfortunately, the tools used to detect such proteins haven't evolved much over the past 50 years - despite there being over 20,000 proteins in our body, the vast majority of protein tests run today target only a single protein at a time.

Now, PhD candidate Milad Dagher, Professor David Juncker and colleagues in McGill's Department of Biomedical Engineering have devised a technique that can detect hundreds of proteins with a single blood sample.

Part of their work, just published in Nature Nanotechnology, describes a new and improved way to barcode micro-beads using multicolour fluorescent dyes. By generating upwards of 500 differently coloured micro-beads, their new barcoding platform enables detection of markers in parallel from the same solution--for example, a blue barcode can be used to detect marker 1, while a red barcode can detect marker 2, and so on. A laser-based instrument called a cytometer then counts the proteins that stick to the different coloured beads.

Though this kind of analysis method has been available for some time, interference among multicolour dyes has limited the ability to generate the right colours. Now, a new algorithm developed by the team enables different colours of micro-beads to be generated with high accuracy--much like a colour wheel can be used to predict the outcome of colour mixing.
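The published barcoding algorithm is not reproduced here, but the "colour wheel" intuition can be illustrated with a toy linear-mixing model: a bead's measured signature is treated as a weighted sum of reference single-dye spectra, so a target colour code can be predicted before the bead is made and the dye amounts recovered afterwards. Every matrix and value below is invented for illustration.

```python
# Toy illustration of the colour-mixing idea described in the article, not the
# published McGill algorithm. A bead's per-channel signal is modelled as a
# linear combination of single-dye reference spectra. All numbers are made up.
import numpy as np

# Rows: detection channels of the cytometer; columns: reference spectra of 3 dyes.
reference_spectra = np.array([
    [0.9, 0.2, 0.0],
    [0.1, 0.7, 0.2],
    [0.0, 0.1, 0.8],
])

def predict_signature(dye_amounts):
    """Predicted per-channel signal for a bead loaded with the given dye amounts."""
    return reference_spectra @ np.asarray(dye_amounts)

def recover_amounts(measured):
    """Invert the mixing model to estimate how much of each dye a bead carries."""
    amounts, *_ = np.linalg.lstsq(reference_spectra, measured, rcond=None)
    return amounts

target = [0.5, 0.3, 0.2]
measured = predict_signature(target)
print(recover_amounts(measured))   # ~ [0.5, 0.3, 0.2]
```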

Professor Juncker's team is hoping to leverage its platform for improved analysis of proteins.

"Current technologies hold a major trade-off between the number of proteins that can be measured at once, and the cost and accuracy of a test", Dagher explains. "This means that large-scale studies, such as clinical trials, are underpowered because they tend to fall back on tried-and-true platforms with limited capabilities."

Their upcoming work focuses on maintaining accurate detection of proteins with increased scale.

Credit: 
McGill University

Another step forward on universal quantum computer

image: This is a nitrogen-vacancy (NV) center in diamond with two crossed wires for holonomic quantum gates over the geometric spin qubit with a polarized microwave.

Image: 
YOKOHAMA NATIONAL UNIVERSITY

Yokohama, Japan - Researchers have demonstrated holonomic quantum gates under zero-magnetic field at room temperature, which will enable the realization of fast and fault-tolerant universal quantum computers.

A quantum computer is a powerful machine with the potential to solve complex problems much faster than today's conventional computers can. Researchers are currently working on the next step in quantum computing: building a universal quantum computer.

The paper, published in the journal Nature Communications, reports experimental demonstration of non-adiabatic and non-abelian holonomic quantum gates over a geometric spin qubit on an electron or nitrogen nucleus, which paves the way to realizing a universal quantum computer.

The geometric phase is currently a key issue in quantum physics. A holonomic quantum gate manipulating purely the geometric phase in the degenerate ground state system is believed to be an ideal way to build a fault-tolerant universal quantum computer. The geometric phase gate or holonomic quantum gate has been experimentally demonstrated in several quantum systems including nitrogen-vacancy (NV) centers in diamond. However, previous experiments required microwaves or light waves to manipulate the non-degenerate subspace, leading to the degradation of gate fidelity due to unwanted interference of the dynamic phase.
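For context, the geometric phase the article refers to is the textbook Berry phase; the relation below is standard quantum mechanics rather than a result of this paper. For a spin-1/2 carried slowly around a closed loop C in parameter space, the acquired geometric phase depends only on the solid angle the loop encloses, not on how fast it is traversed or on the dynamic phase. Holonomic gates generalize this scalar phase to matrix-valued (non-abelian) holonomies acting within a degenerate subspace.

```latex
% Textbook Berry phase for a state |n(R)> transported around a closed loop C;
% for a spin-1/2 aligned (anti-aligned) with the field, the phase is minus
% (plus) half the solid angle Omega enclosed by the loop.
\gamma(C) = i \oint_{C} \langle n(\mathbf{R}) \mid \nabla_{\mathbf{R}}\, n(\mathbf{R}) \rangle \cdot d\mathbf{R},
\qquad
\gamma_{\pm}(C) = \mp \frac{\Omega}{2}
```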

"To avoid unwanted interference, we used a degenerate subspace of the triplet spin qutrit to form an ideal logical qubit, which we call a geometric spin qubit, in an NV center. This method facilitated fast and precise geometric gates at a temperature below 10 K, and the gate fidelity was limited by radiative relaxation," says the corresponding author Hideo Kosaka, Professor, Yokohama National University. "Based on this method, in combination with polarized microwaves, we succeeded in manipulation of the geometric phase in an NV center in diamond under a zero-magnetic field at room temperature."

The group also demonstrated a two-qubit holonomic gate to show universality by manipulating the electron-nucleus entanglement. The scheme renders a purely holonomic gate without requiring an energy gap, which would have induced dynamic phase interference and degraded the gate fidelity. It thus enables precise and fast control over long-lived quantum memories, paving the way for quantum repeaters that interface universal quantum computers with secure communication networks.

Credit: 
Yokohama National University

Women with intellectual and developmental disabilities have almost double the rate of repeat pregnancy

Women with intellectual and developmental disabilities have nearly double the rate of having another baby within a year of delivering compared to women without such disabilities, according to a new study published in CMAJ (Canadian Medical Association Journal).

Rapid repeat pregnancy within one year of a previous live birth is associated with smaller babies, preterm birth, neonatal death and other adverse effects. It also indicates a lack of access to reproductive health care, such as pregnancy planning and contraception.

About one in 100 adults has an intellectual or developmental disability, such as autism-spectrum disorder, Down syndrome, fetal alcohol syndrome and other nonspecific conditions that cause intellectual and developmental limitations.

Researchers analyzed data on 2,855 women with intellectual and developmental disabilities compared with 923,367 women without such disabilities who had a live birth between 2002 and 2013. They found that 7.6% of women with these disabilities had another baby within a year, compared to 3.9% of women without these disabilities.

"Women with intellectual and developmental disabilities are more likely than those without such disabilities to be young and disadvantaged in each marker of social, health, and health care disparities. They experience high rates of poverty and chronic physical and mental illness, and have poor access to primary care," says Hilary Brown, an adjunct scientist at Institute for Clinical Evaluative Sciences (ICES) and lead author of the study.

Rapid repeat pregnancies in women with intellectual and developmental disabilities ended in induced abortion (49%), live birth (33%) and pregnancy loss (18%) compared with induced abortion (59%), pregnancy loss (22%) and live birth (19%) in women without these disabilities.

"This study shows that current efforts to promote reproductive health might not be reaching women with intellectual and developmental disabilities and that there is a lot more we can do to educate and support these women in relation to pregnancy planning and contraception," adds Brown.

Credit: 
Canadian Medical Association Journal

The danger of coronary artery compression in children is more common than we think

image: Images of the different imaging techniques used to diagnose one of the patients who displayed symptoms. Chest x-ray in the posteroanterior (A) and lateral (B) projections, showing the classic pattern of cardiac strangulation from epicardial leads, as the ICD lead courses leftward and posterior around the heart. Computed tomography (C) shows the ICD lead constricting the left ventricle and obtuse marginal branch of the circumflex artery. Catheter angiography (D) shows loss of contrast within the obtuse marginal branch as it courses below the ICD lead.

Image: 
HeartRhythm

Philadelphia, August 13, 2018 - Coronary artery compression in children fitted with epicardial pacemakers may be slightly more common than previously believed, say noted cardiologists. After reviewing patient records at Boston Children's Hospital, they advocate for stricter monitoring to identify patients at risk and prevent complications. Their recommendations are published as a featured article in the journal HeartRhythm, the official journal of the Heart Rhythm Society and the Cardiac Electrophysiology Society.

Children who require pacemakers or defibrillators often need to have wires placed on the outside of their heart due to their size or unique anatomy. In rare instances, these wires can place the child at risk for "cardiac strangulation," which can lead to compression of the heart muscle and coronary arteries (the blood vessels that feed the heart) over time.

"Coronary artery compression is thought to be rare," explained lead investigator Douglas Y. Mah, MD, director of the Pacemaker and ICD Program in the Department of Cardiology, Boston Children's Hospital, and assistant professor of pediatrics, Harvard Medical School, Boston, MA, USA. "Its true incidence, however, may be higher than we believed due either to a lack of awareness or lack of reporting in the literature."

The sudden death of a child with an epicardial pacemaker following coronary artery compression prompted investigators to enhance surveillance of all patients with epicardial pacing or defibrillation systems. They reviewed the records of all patients followed at Boston Children's Hospital from 2000-2017 who had either active or abandoned epicardial wires and whose records included coronary imaging, either by computed tomography (CT) scan or by catheter angiography through the vessels in the leg. Of 145 patients, eight (5.5 percent) exhibited some degree of coronary compression from their epicardial leads. Six of these patients displayed symptoms; in addition to the case of sudden death, there were three cases of chest pain and two cases of unexplained fatigue. As a result of the review, seven patients underwent surgical removal or repositioning of their epicardial leads.

This study helps provide a framework for monitoring patients with epicardial pacemakers or defibrillators and identifying those who may need revision or removal of their epicardial wires. Dr. Mah and colleagues compared three screening techniques. They recommend that pediatric patients with epicardial devices should get screening chest x-rays every few years to assess how their wires look in relation to their heart, as the positioning may change as the child grows. They found that chest x-ray had a high specificity and was a good screening tool, easy to perform, inexpensive, and non-invasive. However, it can produce some false negatives even in symptomatic patients.

The authors propose that patients with concerning chest x-rays, symptoms such as unexplained chest pain or tiredness, or evidence of heart muscle damage or dysfunction should ideally have a cine CT scan that can image the heart moving in relation to the epicardial wires. Although this can also result in a false-positive, CT is less risky for pediatric patients because radiation doses are now much lower for this non-invasive imaging method.

If cine CT is not available, they advocate that patients undergo catheter angiography to confirm the diagnosis before taking a patient to surgery.

"The use of pacemakers and defibrillators in children is growing," noted Dr. Mah. "As more epicardial devices are implanted, more children may be at risk for developing coronary compression from their leads. We hope to increase awareness among healthcare providers and patients of this important, possibly preventable, and potentially fatal complication and provide a useful screening algorithm to detect at-risk patients and ultimately prevent complications."

"This article clearly emphasizes the need to not only carefully evaluate the potential site of electrode head fixation to avoid coronary injury, but also the need to evaluate closely where to route the electrode body to the device pocket," commented Gerald A. Serwer, MD, FHRS, pediatric cardiologist at the University of Michigan's C.S. Mott Children's Hospital, Michigan Medicine, Ann Arbor, MI, USA, in an accompanying editorial.

Dr. Serwer emphasizes that all cardiologists who have patients with epicardial electrodes should always be aware of this potential complication and periodically assess patients for coronary issues with at least a periodic chest x-ray. When evidence strongly suggests ischemia secondary to coronary compression due to electrode position, electrode replacement must be considered in view of the potential morbidity and mortality. "I strongly concur with the authors that any additional information one can obtain to aid in risk assessment would be of benefit and agree with them that additional studies to establish the efficacy of nuclear cardiology techniques are indicated," concluded Dr. Serwer.

Credit: 
Elsevier

Breaking down the Wiedemann-Franz law

image: Artistic impression of the two temperature-imbalanced reservoirs of cold atoms connected through a quantum point contact. The temperature imbalance is induced by a laser beam.

Image: 
ETH Zurich/Esslinger group

From everyday experience we know that metals are good conductors for both electricity and heat -- think inductive cooking or electronic devices warming up upon intense use. That intimate link of heat and electrical transport is no coincidence. In typical metals both sorts of conductivity arise from the flow of 'free' electrons, which move like a gas of independent particles through the material. But when fermionic carriers such as electrons interact with one another, unexpected phenomena can arise, as Dominik Husmann, Laura Corman and colleagues in the group of Tilman Esslinger in the Department of Physics at ETH Zurich -- in collaboration with Jean-Philippe Brantut at the École Polytechnique Fédérale de Lausanne (EPFL) -- report in a paper published this week in the journal Proceedings of the National Academy of Sciences. Studying heat and particle conduction in a system of strongly interacting fermionic atoms, they found a range of puzzling behaviours that set this system apart from known systems in which the two forms of transport are coupled.

In metals, the connection of thermal and electrical conductivity is described by the Wiedemann-Franz law, which was first formulated in 1853. In its modern form the law states that at a fixed temperature, the ratio between the two types of conductivity is constant. The value of that ratio is quite universal, being the same for a remarkably wide range of metals and conditions. That universality breaks down, however, when the carriers interact with one another. This has been observed in a handful of exotic metals hosting strongly correlated electrons. But Husmann, Corman and their co-workers have now explored the phenomenon in a system in which they had exquisite control over all relevant parameters, enabling them to monitor particle and heat transport in unprecedented detail.
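In symbols (a standard textbook statement, not a result specific to this paper), the law fixes the ratio of thermal conductivity to electrical conductivity through the temperature and the universal Lorenz number:

```latex
% Wiedemann-Franz law in its modern (Sommerfeld) form: the ratio of thermal
% conductivity kappa to electrical conductivity sigma in a free-electron metal
% is set by the temperature T and the universal Lorenz number L_0.
\frac{\kappa}{\sigma} = L_0 T,
\qquad
L_0 = \frac{\pi^2}{3}\left(\frac{k_B}{e}\right)^2 \approx 2.44 \times 10^{-8}\ \mathrm{W\,\Omega\,K^{-2}}
```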

Clean transport

The carriers in their experiments are fermionic lithium atoms, which they cooled to sub-microkelvin temperatures and trapped using laser beams. Initially, they confined a few hundred thousand of these atoms to two independent reservoirs that can be heated individually. Once a temperature difference between the two reservoirs had been established, they opened a tiny restriction between them -- a so-called quantum point contact -- thus initiating transport of particles and heat (see the figure). The transport channel is defined and controlled using laser light as well. The experiment therefore provides an extraordinarily clean platform for studying fermionic transport. For example, in real materials, the lattice through which the electrons flow starts to melt at high temperatures. In contrast, in the cold-atom setup, with the structures defined by light, no such 'lattice heating' occurs, making it possible to focus on the carriers themselves.

Puzzling behaviour

When Husmann et al. determined the ratio between thermal and particle conductivity in their system, they found it to be an order of magnitude below the predictions of the Wiedemann-Franz law. This deviation indicates a separation of the mechanisms responsible for particle and heat currents, in contrast to the situation so universally observed for free carriers. As a result, their system evolved into a state in which heat and particle currents vanished long before an equilibrium between the two reservoirs in terms of temperature and particle number had been reached.

Moreover, another measure for thermoelectric behaviour, the Seebeck coefficient, was found to have a value close to that expected for a non-interacting Fermi gas. This is puzzling, because in some regions of the channel the strongly interacting atoms were in the superfluid regime (in which a gas or liquid flows without viscosity) and in the prototypical superfluid, helium-4, the Seebeck coefficient is zero. This discrepancy signals a different thermoelectric character for the fermionic gas studied by the ETH team.

These findings therefore pose new challenges for microscopic modelling of strongly interacting fermion systems. At the same time, the platform established with these experiments could help to explore novel concepts for thermoelectric devices, such as coolers and engines that are based on interconverting temperature differences into particle flow, and vice versa.

Credit: 
ETH Zurich Department of Physics

New type of bed net could help fight against malaria

image: A study bed net hanging up in a living space in Burkina Faso (from fieldwork).

Image: 
Durham University/Steve Lindsay

A new type of bed net could prevent millions of cases of malaria, according to new research published in The Lancet today (10 August).

The two-year clinical trial in Burkina Faso, West Africa, involving 2,000 children, showed that the number of cases of clinical malaria was reduced by 12 per cent with the new type of mosquito net compared to the conventional one used normally.

The study resulted from a collaboration of scientists from Durham University (UK), Centre National de Recherche et de Formation sur le Paludisme (Burkina Faso), Liverpool School of Tropical Medicine (UK) and the Swiss Tropical and Public Health Institute (Switzerland).

It found that:

The number of cases of clinical malaria was reduced by 12 per cent with the new type of mosquito net compared to conventional nets.

Children sleeping under the new bed nets were 52 per cent less likely to be moderately anaemic than those with a conventional net. Malarial anaemia is a major cause of mortality in children under two years old.

In areas with the new combination bed nets, there was a 51 per cent reduction in risk of a malaria-infective mosquito bite compared to areas with conventional nets.

Blood-seeking malaria mosquitoes (female Anopheles mosquitoes) are increasingly becoming resistant to the most common insecticides, called pyrethroids, used to treat traditional bed nets.

Latest figures from the World Health Organisation (WHO) show that after a dramatic decrease in malaria since the start of the millennium, progress has stalled and the number of people infected with malaria is now going up in some areas, with insecticide-resistant vectors as one of the possible causes of this.

The researchers suggest the use of bed nets with a combination of chemicals should be explored for areas where mosquito resistance is a problem.

The new combination nets used in the study contain a pyrethroid insecticide which repels and kills the mosquitoes as well as an insect growth regulator, pyriproxyfen, which shortens the lives of mosquitoes and reduces their ability to reproduce.

In combination, the two ingredients kill more mosquitoes and prevent more infective bites than conventional nets treated only with a pyrethroid.

As mosquitoes are less likely to become resistant to both chemicals in the combination bed nets, they are considered a better option for tackling malaria in areas where mosquitoes have become resistant to the single chemical used in traditional bed nets.

Professor Steve Lindsay, from the Department of Biosciences at Durham University in the UK, said: "This study is important because malaria control in sub-Saharan Africa has stalled, partly because the mosquitoes are adapting and becoming resistant to the pyrethroid insecticides used for treating the old bed nets.

"In our trial in Burkina Faso we tested a new type of net that had a pyrethroid plus an insect growth hormone, which was significantly more protective than the old net type. If we had scaled up our trial to the whole of Burkina Faso we would have reduced the number of malaria cases by 1.2 million.

"Malaria still kills a child every two minutes so we need to keep working to find the best ways to stop this from happening. It is clear that conventional methods used for controlling malaria mosquitoes need to be improved and new additional tools developed."

The latest figures from the World Health Organisation show that in 2016 malaria infected about 216 million people across 91 countries, up five million from the previous year. The disease killed 445,000 people, which was about the same number as in 2015. The majority of deaths were in children under the age of five in the poorest parts of sub-Saharan Africa.

Burkina Faso, with more than 10 million cases of malaria annually, is one of 20 sub-Saharan countries where malaria increased between 2015 and 2016. Mosquitoes in this area are highly resistant to the traditional insecticide: in 2015, a dose designed to kill 100 per cent of susceptible mosquitoes killed only up to 20 per cent of them.
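As a back-of-envelope consistency check (not a calculation taken from the paper), the 1.2 million figure quoted above is simply the trial's 12 per cent reduction applied to Burkina Faso's roughly 10 million annual cases:

```latex
% Illustrative arithmetic from the figures quoted in this article.
0.12 \times 10\ \text{million cases per year} \approx 1.2\ \text{million cases averted per year}
```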

This study is the first clinical trial that has compared a bed net with two active ingredients, a pyrethroid plus an insect growth hormone, against the traditional widely-used nets treated with the pyrethroid insecticide alone.

In this study, conventional bed nets were replaced over time with the new combination nets in 40 rural clusters in Burkina Faso covering 91 villages and involving 1,980 children in 2014 and 2,157 in 2015. The children were aged between six months and five years.

The number of mosquito bites and incidence of clinical malaria in the children in the study were recorded by health clinics and the number of mosquitoes in the houses was tracked through monthly light traps. A number of randomly selected children were visited at home four times and examined clinically for signs of illness. Their blood was also tested for possible anaemia.

Principal investigator in the field trial, Dr Alfred B. Tiono, from the Centre National de Recherche et de Formation sur le Paludisme in Burkina Faso, commented: "We have seen our gains in the battle against malaria progressively lost with the emergence and spread of resistant mosquitoes. The results from this trial gave us a new hope.

"This new invaluable tool would enable us to tackle more efficiently this terrible and deadly disease that affects many children. If deployed correctly, we could certainly prevent millions of cases and deaths of malaria. On behalf of our team, we would like to thank our health authorities and the trial participants for helping us towards reaching this major milestone."

Bed nets are crucial to protect people from malaria and the researchers stress that people in affected areas should always sleep under a bed net, whether that is a conventional or a combination type.

Credit: 
Durham University

NASA sees the wind knocked out of Tropical Storm John

image: On Aug. 9, 2018 at 5:10 a.m. EDT (0910 UTC) NASA's Aqua satellite revealed that cloud top temperatures in the strongest storms were north of John's center of circulation. Those temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 degrees Celsius), shown in red.

Image: 
NASA/NRL

NASA's Aqua satellite passed over the Eastern Pacific Ocean on Aug. 10 and found that Tropical Storm John had the "wind knocked out of it" as a result of moving over cool waters.

At 11 a.m. EDT on Aug. 10, the National Hurricane Center or NHC issued the final advisory on John as it weakened to a remnant low pressure area.

On Aug. 10, infrared data from NASA's Aqua satellite revealed warm cloud tops and little precipitation in the clouds of the former hurricane.

At 4:45 a.m. EDT (0845 UTC) the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument aboard NASA's Aqua satellite analyzed Tropical Storm John's cloud top temperatures in infrared light.

MODIS found that cloud top temperatures of the strongest thunderstorms around John's low-level center were only as cold as minus 40 degrees Fahrenheit (minus 40 degrees Celsius), indicating low clouds and weak uplift of air.

NHC said "John's convection vanished around 12:30 a.m. EDT (0430 UTC), and the cyclone now consists of a tight swirl of low clouds moving over a 71.6 degrees Fahrenheit (22 degrees Celsius) ocean. Given the lack of convection, the system has been classified as a remnant low." Tropical cyclones need ocean temperatures of at least 80 degrees Fahrenheit (26.6 degrees Celsius) to maintain strength. Colder temperatures can quickly sap a tropical cyclone's strength as was the case with John.

The National Hurricane Center (NHC) noted at 11 a.m. EDT (1500 UTC), the center of Post-Tropical Cyclone John was located near 26.8 degrees north latitude and 121.1 degrees west longitude, about 475 miles (760 km) south-southwest of San Diego.

The post-tropical cyclone is moving toward the northwest near 10 mph (17 km/h), and this motion is expected to continue until dissipation occurs in a couple of days. Maximum sustained winds are near 35 mph (55 km/h) with higher gusts. Additional weakening is anticipated.

John may be fading fast but it is still causing ocean swells along the coast. NHC said "Swells generated by John continue to affect portions of the Baja California peninsula and southern California. These swells should begin to gradually subside, but could still cause life-threatening surf and rip current conditions."

Credit: 
NASA/Goddard Space Flight Center

How young people choose their news impacts how they participate in politics

Today's news media landscape consists of more choices than ever before. How young people go about selecting the news they consume in this environment of "information overload" may make a difference in the way they participate in politics, according to new research by a sociology doctoral student at the University of Arizona.

Sam Scovill, who will present the research on Saturday at the American Sociological Association Annual Meeting in Philadelphia, was interested in three primary ways young people, ages 15-25, select what news they consume:

They rely on conventional news sources, such as newspapers and broadcast news in either their traditional or online formats. Scovill refers to this as "elite-selected" media, in which a publisher or producer is choosing which news is presented in mainstream media.

They get their news primarily through social networking sites, like Facebook and Twitter. Scovill refers to this as socially selected news.

They select their news content themselves by actively and critically seeking information on topics that interest them from online-only sources, like YouTube or blogs. They might also subscribe to news updates from those sources.

Scovill looked at how the three different news selection methods impacted young people's engagement in political activities in the following categories: voting, political activism and political campaigning.

Scovill found that study participants age 18 or older who consumed elite-selected media were the most likely to say they voted in the last election, while study participants who intentionally sought out, or self-selected, their media were the most likely to participate in political activism or campaigning.

Getting news from social media did not have a significant impact on political participation in any of the categories examined, although consumers of news on social media were, unsurprisingly, likely to have "liked" a political candidate on Facebook.

Scovill's findings are based on an analysis of the first wave of data from the Youth Participatory Politics dataset, which includes survey responses from a nationally representative sample of 2,920 respondents. The surveying was conducted in 2011 by Knowledge Networks on behalf of Mills College.

While news consumption among young people in the dataset was generally low overall, how they selected their news still proved to make a difference in their political engagement, especially for those who self-selected their news media - which influenced political participation in every category but voting.

Those who self-selected their news were also the most likely to participate in "high-cost" activism and campaigning activities - or those that involve more time, resources or risk of things like judgment by their peers, Scovill said. For example, they were more likely to attend a meeting or a rally for a candidate or issue, or to donate money to a campaign. They also were more likely to sign an online petition or attend a youth political event or protest.

"The overarching pattern was that people who are self-selecting and being intentional about their news consumption are also engaging in these more high-cost forms of activity," Scovill said. "That intentional process matters, whereas news on social media or elite-selected news media are coming through the choices of others who decide what is important to post on Facebook or what is important to go on the front page of the New York Times."

Scovill chose to focus on young people's political engagement not only because teens and young adults are just beginning, or are on the precipice of beginning, political participation as adults, but also because young people, as digital natives, have grown up with so many more choices of how to consume news than previous generations.

"Young people have grown up around this, so they have unique news consumption habits and unique skills in navigating the internet and social media and news media online, but they also are inundated with information," Scovill said. "How we choose news is a lot more complicated than it ever has been, and it might actually impact how people are engaging, so we need to be thinking critically about how those news media have implications for the actions that people decide on."

Scovill plans to continue researching young people's political engagement and how it differs from that of generations past, as well as how young people's personal identity formation contributes to their political engagement.

"I'm particularly interested in Millennials and Generation Z because they get such a bad rap," Scovill said. "People do a lot of negative talk about them being disengaged and not caring, and while it's true that voting numbers are down, people are engaging differently. Young people are using new forms of activism, like signing petitions online or doing their own crowdsourcing online and raising funds for things that matter to them, in ways that older generations might not be."

Credit: 
University of Arizona

Estimating abundance of key wildlife species using satellites

image: Mountain lions are the most common predator of mule deer in western North American ecosystems; their distribution, abundance, and population trends are closely tied to those of their prey (adult female in the Oquirrh Mountains, Utah)

Image: 
photo by D. Stoner

LOGAN, UTAH, USA - Climate and land-use change are shrinking natural wildlife habitats around the world. Yet despite their importance to rural economies and natural ecosystems, remarkably little is known about the geographic distribution of most wild species - especially those that migrate seasonally over large areas. By combining NASA satellite imagery with wildlife surveys conducted by state natural resources agencies, a team of researchers from Utah State University, the University of Maryland and the U.S. Geological Survey modeled the effects of plant productivity on populations of mule deer and mountain lions. Specifically, they mapped the abundance of both species over a climatically diverse region spanning multiple western states.

These models provide new insights into how differences in climate are transmitted through the food chain, from plants to herbivores and then to predators. Prey and predator abundance both increased with plant productivity, which is governed by precipitation and temperature. Conversely, animals responded to decreases in food availability by moving and foraging over larger areas, which could lead to increased conflict with humans. David Stoner, lead author of the study "Climatically driven changes in primary production propagate through trophic levels," published today in the journal Global Change Biology, remarked that, "We expected to see that satellite measurements of plant productivity would explain the abundance of deer. However, we were surprised to see how closely the maps of productivity also predicted the distribution of the mountain lion, their major predator."
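The paper's statistical model is not reproduced here, but the general approach of linking a satellite-derived productivity index to survey-based animal densities can be sketched with a simple regression. Every number and variable name below is invented for illustration.

```python
# Minimal sketch of the general approach (not the published model): relate a
# satellite-derived plant-productivity index (e.g., NDVI) for each survey unit
# to the deer density estimated from ground surveys, then use the fitted line
# to predict expected density wherever the satellite index is available.
# All values and names are hypothetical.
import numpy as np

ndvi = np.array([0.21, 0.28, 0.35, 0.41, 0.47, 0.55, 0.62])   # mean growing-season NDVI per unit
deer_per_km2 = np.array([0.8, 1.4, 2.1, 2.6, 3.3, 4.0, 4.9])  # survey-based density estimates

slope, intercept = np.polyfit(ndvi, deer_per_km2, 1)          # simple linear fit

def predicted_density(ndvi_value):
    return slope * ndvi_value + intercept

print(round(predicted_density(0.50), 2), "deer per km^2 expected at NDVI 0.50")
```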

The study also reveals a disruption in the way scientists study the biosphere. Joseph Sexton, Chief Scientist of terraPulse, Inc. and a coauthor on the study, described the changing technology, "Up until about a decade ago, we were limited to analyzing landscapes through highly simplified maps representing a single point in time. This just doesn't work in regions experiencing rapid economic or environmental change--the map is irrelevant by the time it's finished." Now, given developments in machine learning, "big data" computation, and the "cloud", ecologists and other scientists are studying large, dynamic ecosystems in ever-increasing detail and resolution. "We're now mining global archives of satellite imagery spanning nearly forty years, we're updating our maps in pace with ecosystem changes, and we're getting that information out to government agencies and private land managers working in the field".

The authors predict that, by enabling land managers to monitor rangeland and agricultural productivity, forest loss and regrowth, urban growth, and the dynamics of wildlife habitat, this expanding stream of information will help humanity adapt to climate and other environmental changes. Stoner noted, "State wildlife agencies are tasked with estimating animal abundance in remote and rugged habitats, which is difficult and expensive. Integration of satellite imagery can help establish baseline population estimates, monitor environmental conditions, and identify populations at risk to climate and land-use change."

Credit: 
S.J. & Jessie E. Quinney College of Natural Resources, Utah State University

Higher alcohol taxes are cost-effective in reducing alcohol harms

PISCATAWAY, NJ - Increasing taxes on alcohol is one of the most cost-effective methods of reducing the harms caused by alcohol consumption, according to research in the new issue of the Journal of Studies on Alcohol and Drugs.

Restrictions on alcohol advertising and hours of sale are also a "best buy" when it comes to reducing hazardous and harmful alcohol use and, by extension, improving overall health in the population.

"Tax increases may not sound the most attractive of policy options but are the single most cost-effective way of diminishing demand and reining back consumption," says lead researcher Dan Chisholm, Ph.D., of the Department of Mental Health and Substance Abuse at the World Health Organization in Geneva, Switzerland.

In the study, researchers from the World Health Organization and one of its academic collaborating centers used a statistical model to determine which of five alcohol control strategies could be a cost-effective public health policy to reduce deaths and harms from alcohol consumption. Previous research has indicated that more than 5 percent of deaths worldwide and over 4 percent of diseases are directly related to alcohol.

A 50 percent hike in alcohol excise taxes--that is, taxes worked into the price of the product and that the consumer might not "see"--would cost less than the equivalent of US$100 for each healthy year of life gained in the overall population and would add 500 healthy years of life for every 1 million people.

To put that tax increase in perspective, it might represent mere pennies per drink. According to a study in the January issue of the journal, state excise taxes in America average only three cents per 12 oz. beer or 5 oz. glass of wine and only five cents for a drink with 1.5 oz. of hard liquor.
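Worked out from the figures quoted in the two paragraphs above (illustrative arithmetic only, not a calculation from the study), a 50 percent hike on those average state excise taxes would indeed amount to pennies per drink:

```latex
% Illustrative arithmetic using the average US state excise taxes cited above.
0.5 \times \$0.03 = \$0.015 \ \text{extra per 12 oz. beer or 5 oz. glass of wine},
\qquad
0.5 \times \$0.05 = \$0.025 \ \text{extra per drink with 1.5 oz. of hard liquor}
```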

"Current rates of excise taxes on alcohol vary considerably between jurisdictions but can be set very low," Chisholm says, "for example because of low awareness of the risks that alcohol consumption can pose to health or because of strong advocacy from economic operators."

But increasing these rates is "an ambitious but feasible strategy," according to the study, and this change in public policy "would bring excise taxes for alcoholic beverages more in line with those imposed on tobacco products."

Restricting hours of operation for off-premise alcohol retailers or implementing and enforcing strong restrictions/bans on alcohol advertising (on the Internet, radio, television, and billboards) each would also cost less than $100 per healthy year of life gained and would add up to 350 healthy life years for every 1 million people in the population.

Stronger enforcement of blood alcohol concentration laws by increasing the number of sobriety checkpoints would be a somewhat less cost-effective policy: Their model showed it would cost up to $3,000 per healthy year of life saved and would add fewer than 100 years of healthy life per 1 million people. The higher cost would be the result of more time invested by police and the equipment required at checkpoints.

Chisholm and colleagues found that wider use of brief alcohol-problem screening and intervention performed by primary care doctors would generate up to 1,000 years of healthy life per 1 million people, but at a price of up to $1,434 per year of healthy life gained.

The study used data from 16 countries, including upper middle- and high-income countries (such as the United States, Germany, Japan, and China) as well as low- and lower middle-income countries (such as Guatemala, India, Ukraine, and Vietnam).

The report's authors note that they likely underestimated the benefits of improved alcohol control strategies. Their study did not look at effects such as reduced property damage or better productivity at work, among other likely benefits of less overall alcohol consumption in the population.

Nonetheless, not everyone will necessarily think that less alcohol consumption is good policy. "Implementation of these effective public health strategies is actively fought by the alcohol industry, often with threats of lost jobs and/or revenue for countries," the authors write.

In the end, the authors hope their research will "guide decision makers toward a more rational and targeted use of available resources . . . for addressing the substantial and still growing burden of disease attributable to alcohol use."

Credit: 
Journal of Studies on Alcohol and Drugs