Tech

Caution: Grapefruit juice may impose risk on patients with long QT syndrome and should be avoided when taking QT-prolonging drugs

image: These four graphs show deviations of the QTc from baseline during different study days with different interventions. The horizontal axis shows the time (in hours) starting after the third baseline ECG. The vertical axis represents changes in QTc (either prolongation or shortening) from baseline (or delta QTc) following a given intervention. The yellow arrow denotes the timing of administration of oral moxifloxacin after the third baseline electrocardiogram and the pink and red arrows denote repeated administration of grapefruit juice. Panel A compares the delta-QTc during the off-drug day before moxifloxacin (in green) and during the off-drug day before grapefruit (in blue). Note that the two graphs are similar, never reaching a statistically significant difference. Panels B and C show the delta QTc among healthy volunteers after receiving moxifloxacin (yellow line) or grapefruit (red line) and the delta QTc during their respective off-drug study day. Panel D shows the delta QTc of long-QT patients after drinking grapefruit (pink line) and during the off-drug day (light green).

Image: 
HeartRhythm

Philadelphia, May 8, 2019 - Grapefruit juice is already listed as a substance to avoid when taking QT-prolonging medications because it increases the toxicity of many drugs. Investigators have now confirmed the QT-prolonging effects of grapefruit juice in a new study and call for a stronger warning to patients who are taking QT-prolonging drugs or who have long QT syndrome because of the potential risk. They report their findings in HeartRhythm, the official journal of the Heart Rhythm Society and the Cardiac Electrophysiology Society, published by Elsevier.

There are over 200 medications that prolong the QT interval, the time it takes for your heart muscle to recharge between beats. The list includes not only antiarrhythmic drugs, but also medications with no cardiac indications such as some antibiotics, antihistamines, and antipsychotic drugs. These drugs work mainly by blocking a specific "IKr" potassium channel on the myocardial (cardiac muscle) cell membrane, thus prolonging the repolarization in the ventricles of the heart. Abnormalities in the QT interval can also be caused by genetic conditions such as long QT syndrome.

"With so many drugs, of such varied composition blocking the IKr channel, it is reasonable to assume that food compounds may also have IKr-channel-blocker properties, raising the possibility that 'proarrhythmic food' exists," explained Sami Viskin, MD, Department of Cardiology, Tel Aviv Sourasky Medical Center and Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel, who led the study. "Previous studies showed that flavonoid compounds contained in grapefruit juice have IKr channel- blocking properties. We therefore tested the possibility that grapefruit juice has QT-prolonging properties."

Investigators tested the effects of grapefruit juice on the QT interval using the same stringent criteria the pharmaceutical industry applies to new drugs before market release. The study followed the "Guidelines for the Clinical Evaluation of QT/QTc for Non-antiarrhythmic Drugs," namely a randomized crossover design with accurate, blinded QT analysis. As a positive control, they used moxifloxacin, an antibiotic with known, albeit small, QT-prolonging properties. Thirty healthy volunteers and ten patients with congenital long QT syndrome participated in a four-day thorough QT study. On days one and three, study subjects received no study drugs and underwent multiple electrocardiogram (ECG) recordings to measure the spontaneous within-day and day-to-day variability of their QT interval. On days two and four, healthy participants received either moxifloxacin (one oral dose of 400 mg) or grapefruit juice (two liters in three divided doses, hours apart) in random order. The patients with long QT syndrome received only grapefruit juice.
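
The primary endpoint in such studies is the change in the heart-rate-corrected QT interval (delta QTc) from baseline. As a minimal sketch only, the snippet below uses Bazett's correction; the press release does not state which correction formula the investigators actually applied, so the formula choice and function names are illustrative assumptions.

```python
# Hedged sketch: delta QTc from raw QT and RR measurements.
# Bazett's correction (QTc = QT / sqrt(RR in seconds)) is assumed here; the study
# may have used a different correction method.
def qtc_bazett(qt_ms: float, rr_s: float) -> float:
    """Heart-rate-corrected QT interval in milliseconds."""
    return qt_ms / (rr_s ** 0.5)

def delta_qtc(qtc_baseline_ms: float, qtc_post_ms: float) -> float:
    """Change from baseline after an intervention (positive = prolongation)."""
    return qtc_post_ms - qtc_baseline_ms

# Example: QT measured at a heart rate of 75 bpm (RR = 60/75 = 0.8 s).
baseline = qtc_bazett(400, 0.8)   # ~447 ms
post_drug = qtc_bazett(415, 0.8)  # ~464 ms
print(round(delta_qtc(baseline, post_drug), 1))  # ~16.8 ms of prolongation
```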

The study confirmed that grapefruit juice prolongs the QT interval. Among healthy volunteers the net QT prolongation was small, but comparable to that caused by moxifloxacin. The grapefruit-induced QT prolongation was greater in females than in males and more so in patients with congenital long QT syndrome.

Dr. Viskin and colleagues summarize the implications for patient subgroups as follows:

Patients taking cardiac or non-cardiac medications that prolong the QT interval. Grapefruit juice is already listed as a substance to avoid when taking QT-prolonging medications because it inhibits the metabolism of many drugs, increasing their toxicity. The confirmed QT-prolonging effects of grapefruit juice call for an even stronger warning.

Asymptomatic, healthy individuals. The doses of grapefruit juice tested in this study were large (two liters in three divided doses) and the net QT prolongation observed was small. The study does not imply that daily consumption of grapefruit juice involves any measurable risk for the general population. A possible exception could be the consumption of "health drinks" containing highly concentrated grapefruit products.

Patients with congenital or other forms of long QT syndrome. Investigators recommend that these patients should be informed that drinking grapefruit juice in large quantities may impose some risk.

"The net increase in QT interval caused by grapefruit among healthy volunteers was small, but in the range that, if grapefruit juice were a new drug in development, the results of the present study would probably lead the FDA to call for additional studies before issuing a final recommendation based on its expected benefits and risks," commented Dr. Viskin.

Credit: 
Elsevier

Identifying therapeutic targets in sepsis' cellular videogame

LEXINGTON, Ky. (May 8, 2019) -- Sepsis is a medical condition that few patients have heard of and most doctors dread. The body's response to attack by bacteria can trigger a cascade of cellular self-destruction that inadvertently causes blood clots, multi-organ failure, and death.

The immune system functions as a sort of cellular Pac-Man, using white blood cells to hunt out the "bad guys," initiating attacks and counter-attacks. However, in extreme cases, white blood cells commit a sort of hara-kiri, triggering their own death in an attempt to destroy the infection. Sometimes it works -- but when it doesn't, the complications are dangerous.

The arsenal of weapons to treat severe cases of sepsis is miserably small, and physicians have little to provide other than antibiotics, fluids, and hope. Exciting new research has defined the chain of molecular events that goes awry in sepsis, opening up opportunities for new treatments to fight the condition that affects more than a million Americans each year and kills up to a third of them.

Two collaborating laboratories at the University of Kentucky were able to establish the chain of events within white blood cells that progresses from inflammasome activation to a type of programmed cell death called pyroptosis -- and culminates in the damaging blood clots.

"Recent studies have uncovered the mechanism of pyroptosis following inflammasome activation, but we didn't know how pyroptotic cell death drives the disease process," said Zhenyu Li, M.D., Ph.D., an associate professor in the University of Kentucky's Department of Molecular and Cellular Biochemistry.

"If we could uncover that link, it would open up possibilities for therapies that target inflammatory, infection-mediated clotting."

The teams, led by Li and Yinan Wei, Ph.D. of UK's Department of Chemistry, determined that certain bacterial proteins and endotoxin trigger inflammasome activation in white blood cells, causing pyroptosis. During pyroptosis, pores form in the white blood cell membrane that result in the release of tissue factor, a protein known to initiate the clotting process.

"Our data establish inflammasome activation as an important link between inflammation and blood clotting," Li said. "Our findings advance the understanding of the relationship between bacterial infections and coagulation as well as provide evidence that inflammasome may be a potential therapeutic target for sepsis."

Credit: 
University of Kentucky

Avocados, as a substitution for carbohydrates, can suppress hunger without adding calories

Chicago, May 7, 2019 - A new study released by the Center for Nutrition Research at Illinois Institute of Technology suggests that meals that include fresh avocado as a substitute for refined carbohydrates can significantly suppress hunger and increase meal satisfaction in overweight and obese adults.

As rates of obesity in the United States continue to rise, the findings from Illinois Tech suggest that simple dietary changes can have an important impact on managing hunger and aiding metabolic control.

The new research, published in the peer-reviewed journal Nutrients, assessed the underlying physiological effects of including whole and half fresh Hass avocados on hunger, fullness, and how satisfied subjects felt over a six-hour period. Researchers evaluated these effects in 31 overweight and obese adults in a randomized three-arm crossover clinical trial. These dietary changes were also shown to limit insulin and blood glucose excursions, further reducing the risk of diabetes and cardiovascular disease by adding healthy fats and fibers into a regular daily diet.

"For years, fats have been targeted as the main cause of obesity, and now carbohydrates have come under scrutiny for their role in appetite regulation and weight control," said Britt Burton-Freeman, Ph.D., director of the Center for Nutrition Research at Illinois Tech. "There is no 'one size fits all' solution when it comes to optimal meal composition for managing appetite. However, understanding the relationship between food chemistry and its physiological effects in different populations can reveal opportunities for addressing appetite control and reducing rates of obesity, putting us a step closer to personalized dietary recommendations."

The research found that meals including avocado not only resulted in a significant reduction in hunger and an increase in how satisfied participants felt, but also found that an intestinal hormone called PYY was an important messenger of the physiological response.

Credit: 
Illinois Institute of Technology

Americans get 20 percent of their electricity from nuclear, but are trapped in the 1950s when it comes to acceptance

In the ongoing effort to decarbonize U.S. energy production, there is one energy source that often attracts great controversy. Nuclear power has been a part of the American energy portfolio since the 1950s and still generates one in every five kilowatt-hours of electricity produced in the country. Still, for a number of reasons, including the association between radiation and cancer, the general public has long felt a significant dread about it.

The clinical and biological significance of HER2 over-expression in breast ductal carcinoma in situ: A large study from a single institution

Breast cancer is the most common cancer in the UK and is responsible for thousands of deaths each year. Ductal carcinoma in situ (DCIS) is a common type of breast cancer that does not spread, but can progress over time to become invasive cancer. DCIS is normally removed surgically; however, there is a risk of recurrence after treatment, and it is currently difficult to predict this risk when assessing patients.

This study looked at over 860 patients with DCIS to find ways of predicting whether DCIS is likely to return after surgery or develop into metastatic disease. We found that HER2 expression can predict DCIS recurrence and progression. By screening DCIS patients for HER2, we could ensure that high-risk patients are offered treatments to minimize the chance of recurrence or spread, while lower risk DCIS patients could avoid unnecessary treatment.

Credit: 
Cancer Research UK

Apgar scores 'within the normal range' linked to higher risks of illness and death

Apgar scores of 7, 8, and 9 (considered to be within the normal range) are associated with higher risks of illness and even death in newborns, finds a large study from Sweden published by The BMJ today.

The odds of problems are increased with "normal" scores less than 10, but the researchers stress that the risk is still low and certainly lower than for babies with scores outside of the normal range.

The Apgar score is a quick and simple way to assess a baby's condition at birth. The baby is assessed at one, five and 10 minutes after birth on five simple criteria (complexion, pulse rate, reaction when stimulated, muscle tone, and breathing) on a scale from zero to two. The five values are then added up to obtain an overall score from zero to 10.
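
Since the overall score is just the sum of the five criterion ratings, a minimal sketch of the arithmetic looks like this (the criterion names are paraphrased from the description above, not taken from any clinical scoring library):

```python
# Apgar scoring as described above: five criteria, each rated 0-2, summed to 0-10.
CRITERIA = ["complexion", "pulse_rate", "reaction_to_stimulation", "muscle_tone", "breathing"]

def apgar_score(ratings: dict[str, int]) -> int:
    """Sum the five criterion ratings; each rating must be 0, 1 or 2."""
    assert set(ratings) == set(CRITERIA), "rate all five criteria"
    assert all(r in (0, 1, 2) for r in ratings.values()), "each rating is 0-2"
    return sum(ratings.values())

# Example: full marks on everything except breathing (rated 1) gives a score of 9.
print(apgar_score({"complexion": 2, "pulse_rate": 2, "reaction_to_stimulation": 2,
                   "muscle_tone": 2, "breathing": 1}))  # 9
```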

Scores of less than seven are considered low and are known to carry higher risks of infections and breathing problems, as well as long term conditions such as epilepsy and cerebral palsy.

Scores of 7 to 10 are considered to be "within the normal range" and therefore reassuring. But no study has investigated whether normal scores of 7, 8, or 9 are associated with greater risk of illness or death than a perfect score of 10.

So a research team, led by Dr Neda Razaz at the Karolinska Institutet in Sweden, set out to compare associations between Apgar scores of 7, 8, and 9 (vs 10) with illness and death in newborns.

They analysed data from more than 1.5 million Swedish infants born at full term between 1999 and 2016. Infants with Apgar scores of 7, 8, and 9 at one, five, and 10 minutes after birth were compared with those with an Apgar score of 10 at one, five, and 10 minutes after birth.

After taking account of several factors, such as mother's age, weight (BMI), and smoking during pregnancy, the researchers found that Apgar scores of 7, 8 and 9 at one, five, and 10 minutes after birth were strongly associated with higher risk of infections, breathing problems, brain injury as a result of oxygen deprivation, low blood sugar levels, and death compared with an Apgar score of 10.

For example, compared with a one-minute Apgar score of 10, a one-minute Apgar score of 9 was associated with 1.5-fold higher odds of infections (0.8 vs 0.5 per 100 births).

At five and 10 minutes, the odds were progressively larger: 2.1-fold (1.7 vs 0.7 infections per 100 births) at 5 minutes, and 3.3-fold (2.9 vs 0.8 infections per 100 births) at 10 minutes.
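
To make these comparisons concrete, a crude odds ratio can be computed directly from event rates quoted per 100 births, as in the sketch below. Note that the figures reported in the study are adjusted for maternal factors, so they differ slightly from these unadjusted values.

```python
# Crude (unadjusted) odds ratio from event rates given per 100 births.
def odds(rate_per_100: float) -> float:
    p = rate_per_100 / 100.0
    return p / (1.0 - p)

def odds_ratio(rate_exposed: float, rate_reference: float) -> float:
    return odds(rate_exposed) / odds(rate_reference)

# One-minute infection rates quoted above: 0.8 vs 0.5 per 100 births.
print(round(odds_ratio(0.8, 0.5), 2))  # ~1.6 unadjusted; the study reports 1.5 after adjustment
```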

A small change in Apgar score from 10 at five minutes to 9 at 10 minutes was also associated with increased risk, compared with a stable score of 10 at five and 10 minutes.

This is an observational study, and as such, can't establish cause, and the researchers point to some limitations, such as a lack of information about birth interventions that could influence Apgar scores.

Nevertheless, they say their study included over 1.5 million births over an 18-year period and they were able to account for important factors that could have affected the results.

In summary, the authors say their study shows that low Apgar scores within the normal range (7-10) "are strongly associated with neonatal mortality and morbidity and that these associations are substantially stronger with increasing time after birth."

They add: "Our findings provide strong evidence to support the proposition that the optimal Apgar score is 10 at each time point, and all newborns should be assigned an Apgar score at 10 minutes, regardless of their score at one and five minutes."

Credit: 
BMJ Group

dnDSA and ethnicity linked with thickening of blood vessels after kidney transplant

image: This is Asha Moudgil, M.D., Medical Director, Transplant and senior study author, Children's National.

Image: 
Children's National

VANCOUVER, British Columbia - Children who developed anti-human leukocyte antibodies against their donor kidney, known as de novo donor-specific antibodies (dnDSA), after kidney transplant were more likely to experience carotid intima-media thickening (CIMT) than those without these antibodies, according to preliminary research presented May 7, 2019, during the 10th Congress of the International Pediatric Transplant Association.

dnDSA play a key role in the survival of a transplanted organ. While human leukocyte antibodies protect the body from infection, dnDSA are a major cause of allograft loss. CIMT measures the thickness of the intima and media layers of the carotid artery and can serve as an early marker of cardiac disease.

Emerging evidence links dnDSA with increased risk of accelerated systemic hardening of the arteries (arteriosclerosis) and major cardiac events in adult organ transplant recipients. However, this phenomenon has not been studied extensively in children who receive kidney transplants.

To investigate the issue, Children's researchers enrolled 38 children who had received kidney transplants and matched them by race with 20 healthy children. They measured CIMT, blood pressure and lipids, and monitored dnDSA, at 18 and 30 months after transplant. The transplant recipients' median age was 11.3 years, 50 percent were African American, and 21 percent developed dnDSA.

"In this prospective controlled cohort study, we compared outcomes among patients who developed dnDSA with transplant recipients who did not develop dnDSA and with race-matched healthy kids," says Kristen Sgambat, Ph.D., a pediatric renal dietitian at Children's National who was the study's lead author. "Children with dnDSA after transplant had 5.5% thicker CIMT than those who did not have dnDSA. Being African American was also independently associated with a 9.2% increase in CIMT among transplant recipients."

Additional studies will need to be conducted in larger numbers of pediatric kidney transplant recipients to verify this preliminary association, Sgambat adds.

10th Congress of the International Pediatric Transplant Association presentation:

* "Circulating de novo donor-specific antibodies and carotid intima-media thickness in pediatric kidney transplant recipients."

Kristen Sgambat, Ph.D., pediatric renal dietitian and study lead author; Sarah Clauss, M.D., cardiologist and study co-author; and Asha Moudgil, M.D., Medical Director, Transplant and senior study author, all of Children's National.

Credit: 
Children's National Hospital

Study shows adult tourniquet suitable for school-age children

WILMINGTON, Del. (May 7, 2019) - Researchers with Nemours Children's Health System have shown the effectiveness of an adult tourniquet for use in children, according to a study published today by the journal Pediatrics. While developed for adults, the military's Combat Application Tourniquet (CAT) is effective in controlling blood flow in children's arms and legs, as measured by Doppler pulse, in 100 percent of cases involving upper extremities and 93 percent for lower extremities. This is the first prospective study of the device's use in children. Past anecdotal, retrospective reports from international warzones have indicated the CAT is being used in pediatric trauma cases.

"Firearm injuries and death are unfortunately not uncommon, and we need an effective tool for treating extremity hemorrhage in children in traumatic situations. Tourniquets have the potential to save lives from gunshot injuries since a severely injured child could bleed to death before medical help can arrive." said H. Theodore (Ted) Harcke, MD, lead author of the study and physician and researcher at Nemours/Alfred I. duPont Hospital for Children. Dr. Harcke is also a retired US Army Colonel and serves as forensic radiologist for the Armed Forces Medical Examiner System. "Our data shows that the tourniquet used by the military is easy to apply and suitable for use in the school age population."

In the study, the Nemours research team applied a CAT to an upper arm and thigh of 60 volunteer participants, aged six to 16 years, and monitored their pulse using vascular Doppler ultrasound. The tourniquet was applied according to the manufacturer's guidelines. The study sample was reflective of U.S. school populations. Participants included 36 boys and 24 girls, with body mass index (BMI) ranging from underweight to obese.

The Pediatric Trauma Society supports tourniquet use for life-threatening hemorrhage caused by extremity trauma. Additionally, Stop the Bleed, an initiative of the American College of Surgeons and the Hartford Consensus, is currently instructing school staff, faculty, and students how to use tourniquets. However, since tourniquets generally are not designed for children, the authors' chief concern was the safety and effectiveness of use in younger children with smaller limbs.

The protocol allowed no more than three turns of the tourniquet windlass, to avoid pain to participants. The three-turn maximum allowed by the protocol was enough for all upper extremities and all but three lower extremities. Three turns did not completely arrest the pulse in three older, obese subjects (BMI > 30) who were adult-sized. The team anticipates that additional windlass turns, as used in actual trauma care, would stop blood flow in an injured lower limb in these cases.

The study's greatest impact may be on pre-hospital care. However, the researchers note its relevance to all pediatricians, who should be familiar with tourniquet use in children to ensure that the development of guidelines for training and application is appropriate and medically correct.

Credit: 
Nemours

New open source software eases the pain of multiple UI designs

The time-consuming and labour-intensive task of designing multiple user interfaces for different screen sizes and orientations could become a thing of the past thanks to open-source software that uses a new paradigm to speed up or even automate the process.

The ORC Layout (OR-constraint Layout) software is being launched at the ACM CHI Conference on Human Factors in Computing Systems in Glasgow, Scotland, today (7 May 2019). The software, a collaboration between the University of Bath, the University of Maryland and Simon Fraser University, offers a new approach to UI design based on flexible principles to intelligently suggest layouts for different screens.

Currently, a user interface (UI) has to be built for every type of screen - desktop, tablet and mobile phone - and for both portrait and landscape orientations. This is not only very time consuming, but it also increases the chance of errors creeping in as it becomes hard to keep track of changes and iterations.

Some software already exists to help automate this process, but both existing approaches have severe limitations.

The first, traditional constraint-based layout, applies rigid rules to design, for instance always placing one icon below another. However, this rigidity can cause problems when changing size and orientation, leading to an ugly or confusing UI.

The second method is flow layout, whereby components of the screen design can automatically move into new rows or columns as space runs out. But it is limited in the way that alignment of components can be specified.

ORC Layout merges the strengths of these two approaches by allowing designers to use all the features of traditional constraint-based layout and flow layout together and specify flexible alternatives for UI components and widgets. For example, designers can specify which widgets are essential and which are optional. By using Boolean logic, ORC Layout can automatically suggest intuitive alternative layouts for different screens.
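
The published tool resolves these choices with constraint solving; the sketch below is only a rough, hypothetical illustration of the underlying idea (none of the names come from the ORC Layout codebase): each widget lists acceptable placements in order of preference, the first alternative that fits is kept, and optional widgets are dropped when nothing fits.

```python
# Hypothetical illustration of OR-alternatives in a layout: keep the first
# placement that fits the remaining width; drop optional widgets otherwise.
from dataclasses import dataclass

@dataclass
class Alternative:
    placement: str  # e.g. "top", "left", "bottom"
    width: int      # horizontal space this alternative needs

@dataclass
class Widget:
    name: str
    alternatives: list[Alternative]  # ordered by designer preference
    optional: bool = False           # optional widgets may be dropped entirely

def choose_layout(widgets: list[Widget], screen_width: int) -> dict:
    layout, remaining = {}, screen_width
    for w in widgets:
        chosen = next((a for a in w.alternatives if a.width <= remaining), None)
        if chosen is None and not w.optional:
            raise ValueError(f"no feasible alternative for required widget {w.name}")
        layout[w.name] = chosen.placement if chosen else None  # None = dropped
        if chosen:
            remaining -= chosen.width
    return layout

toolbar = Widget("toolbar", [Alternative("top", 600), Alternative("left", 80)])
search = Widget("search", [Alternative("top", 300)], optional=True)
print(choose_layout([toolbar, search], screen_width=400))
# {'toolbar': 'left', 'search': 'top'}
```

A real OR-constraint solver weighs the alternatives for all widgets jointly rather than greedily, which is what lets a tool like ORC Layout settle on a coherent arrangement for an entire screen.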

Dr Christof Lutteroth from the University of Bath's Department of Computer Science, who worked on the software, said: "ORC Layout can be applied to any device, to any platform, and the idea is really very simple: if there's no space for the toolbar at the top of the screen, why not put it onto the left of the screen or the bottom of the screen? This is exactly what designers have to do when thinking about different screens.

"In our new layout method we bring all these alternatives together. For instance a designer can start with a desktop screen and design it the way they want, then start marking elements as optional if there's not enough space, looking at what happens when screen is rotated.

"By putting all the alternatives together, no matter what kind of device, ORC can automatically figure out what the best alternatives are for you. It's really quite exciting as it really changes the design process to make it simpler, but also reduces potential for problems - such as forgetting to update one version of a UI."

The team sees the software as having widespread applications in modern web design, document formatting and app layouts.

The research is published in CHI Conference on Human Factors in Computing Systems Proceedings. The team now wants to continue to work on optimising the ORC Layout software, including by reducing the computing power it requires.

The ORC layout team will present the research at ACM CHI Conference on Human Factors in Computing Systems in Glasgow, Scotland, today at 11AM.

Credit: 
University of Bath

Collision-detecting suitcase, wayfinding app help blind people navigate airports

video: Researchers at Carnegie Mellon University, the University of Tokyo and Waseda University in Tokyo have developed a smart suitcase, called BBeep, that helps blind travelers navigate through crowds.

Image: 
Carnegie Mellon University

PITTSBURGH--Carnegie Mellon University researchers say a smart suitcase that warns blind users of impending collisions and a wayfinding smartphone app can help people with visual disabilities navigate airport terminals safely and independently.

The rolling suitcase sounds alarms when users are headed for a collision with a pedestrian, and the navigation app provides turn-by-turn audio instructions to users on how to reach a departure gate -- or a restroom or a restaurant. Both proved effective in a pair of user studies conducted at Pittsburgh International Airport.

The researchers will present their findings at CHI 2019, the Association for Computing Machinery's Conference on Human Factors in Computing Systems, May 4-9 in Glasgow, Scotland.

CMU and Pittsburgh International Airport are partners in developing new systems and technologies for enhancing traveler experiences and airport operations.

"Despite recent efforts to improve accessibility, airport terminals remain challenging for people with visual impairments to navigate independently," said Chieko Asakawa, IBM Distinguished Service Professor in CMU's Robotics Institute and an IBM Fellow at IBM Research. Airport and airline personnel are available to help them get to departure gates, but they usually can't explore and use the terminal amenities as sighted people can.

"When you get a five- or six-hour layover and you need to get something to eat or use the restrooms, that is a major hassle," said one legally blind traveler who participated in a focus group as part of the research. "It would be lovely to be able to get up and move around and do things that you need to do and maybe want to do."

An increasing number of airports have been installing Bluetooth beacons, which can be used for indoor navigation, but often they are deployed to enhance services for sighted travelers, not to help blind people, said Kris Kitani, assistant research professor in the Robotics Institute.

He and his colleagues deployed NavCog, a smartphone-based app that employs Bluetooth beacons, at Pittsburgh International Airport. The app, developed by CMU and IBM to help blind people navigate independently, previously has been deployed on campuses, including CMU, and in shopping malls. They modified it for use at the airport, where extremely wide corridors make users vulnerable to veering, and for use with moving walkways. As part of the project, the airport installed hundreds of Bluetooth beacons throughout the facility.

"Part of our commitment to the public includes making sure our airport works for everyone, particularly as we modernize our facility for the future," said Pittsburgh International Airport CEO Christina Cassotis. "We're proud to partner with such great researchers through Carnegie Mellon University. Having that world-class ingenuity reflected at our airport is emblematic of Pittsburgh's transformation."

The app gives audio directions to users. It relies on a map of the terminal that has been annotated with the locations of restrooms, restaurants, gates, entrances and ticketing counters.
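
NavCog's actual localization and routing are more sophisticated than this, but purely as an illustrative assumption (the map entries, beacon IDs and function below are invented, not the NavCog API), a beacon-annotated map can be thought of as pairing each beacon with a place name and a spoken instruction:

```python
# Illustrative only -- not NavCog's implementation: treat the strongest Bluetooth
# beacon as the current location and read out the instruction annotated for it.
from typing import NamedTuple

class Beacon(NamedTuple):
    beacon_id: str
    rssi_dbm: int  # received signal strength; higher (less negative) means closer

# Hypothetical annotated map: beacon id -> (place name, instruction toward the gate).
ANNOTATED_MAP = {
    "b-101": ("ticketing counter", "Walk straight for 30 meters, then turn left."),
    "b-102": ("security exit", "Take the escalator down to the airside core."),
    "b-103": ("gate A5", "You have arrived at gate A5."),
}

def next_instruction(scanned: list[Beacon]) -> str:
    nearest = max(scanned, key=lambda b: b.rssi_dbm)  # strongest signal wins
    place, instruction = ANNOTATED_MAP[nearest.beacon_id]
    return f"Near {place}. {instruction}"

print(next_instruction([Beacon("b-101", -72), Beacon("b-102", -55)]))
# Near security exit. Take the escalator down to the airside core.
```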

Ten legally blind people tested the app using an iPhone 8 with good results, traversing the terminal's large open spaces, escalators and moving walkways with few errors. Most users were able to reach the ticketing counter in three minutes, traverse the terminal in about six minutes, go from the gate to a restroom in a minute and go from the gate to a restaurant in about four minutes.

The NavCog app for iPhone is available for free from the App Store and can be used at Pittsburgh International in the ticketing area of the landside terminal and in the concourses and center core of the airside terminal.

Another team, including researchers from the University of Tokyo and Waseda University in Tokyo, developed the smart suitcase, called BBeep, to help with another problem encountered in airports -- navigating through crowds. The assistive system has a camera for tracking pedestrians in the user's path and can calculate when there is a potential for collision.

"Sighted people will usually clear a path if they are aware of a blind person," said Asakawa, who has been blind since age 14. "This is not always the case, as sighted people may be looking at their smartphone, talking with others or facing another direction. That's when collisions occur."

BBeep helps clear a path. A rolling suitcase itself can help clear the way and can serve as an extended sensing mechanism for identifying changes in floor texture. BBeep, however, can also sound an alarm when collisions are imminent -- both warning the user and alerting people in the area, enabling them to make room. A series of beeps begins five seconds before collision. The frequency of the beeps increases at 2.5 seconds. When collision is imminent, BBeep issues a stop sound, prompting the blind user to halt immediately.
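
The 5-second and 2.5-second thresholds in the sketch below come from the description above; the time-to-collision estimate, the assumed cutoff for the stop sound and the function names are illustrative, not taken from the BBeep source.

```python
# Sketch of BBeep's escalating warnings as described above.
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """Seconds until the tracked pedestrian and the user would meet."""
    return float("inf") if closing_speed_mps <= 0 else distance_m / closing_speed_mps

def warning_mode(ttc_s: float) -> str:
    if ttc_s <= 1.0:   # "imminent" cutoff assumed here; the article gives no exact value
        return "stop sound"
    if ttc_s <= 2.5:   # beep frequency increases at 2.5 seconds
        return "fast beeps"
    if ttc_s <= 5.0:   # beeping begins five seconds before collision
        return "slow beeps"
    return "silent"

print(warning_mode(time_to_collision(distance_m=2.0, closing_speed_mps=1.0)))  # fast beeps
```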

In tests at the airport, six blind participants each wheeled BBeep with one hand and used a white cane in the other as they maneuvered through crowded areas. They were asked to walk five similar routes in three modes -- one where the suitcase gave no warnings, another in which the warnings could only be heard by the user through a headset and another in which warnings were played through a speaker. A researcher followed each participant to make sure no one was injured.

The researchers said the speaker mode proved most effective, both in reducing the number of pedestrians at risk of imminent collision and in reducing the number of pedestrians in the user's path.

"People were noticing that I was approaching and people were moving away ... giving me a path," one user observed.

Credit: 
Carnegie Mellon University

Show your hands: Smartwatches sense hand activity

image: Researchers at Carnegie Mellon University's Human-Computer Interaction Institute (HCII) have used a standard smartwatch to figure out when a wearer was typing on a keyboard, washing dishes, petting a dog, playing piano or a number of other hand activities.

Image: 
Carnegie Mellon University

PITTSBURGH--We've become accustomed to our smartwatches and smartphones sensing what our bodies are doing, be it walking, driving or sleeping. But what about our hands? It turns out that smartwatches, with a few tweaks, can detect a surprising number of things your hands are doing.

Researchers at Carnegie Mellon University's Human-Computer Interaction Institute (HCII) have used a standard smartwatch to figure out when a wearer was typing on a keyboard, washing dishes, petting a dog, pouring from a pitcher or cutting with scissors.

By making a few changes to the watch's operating system, they were able to use its accelerometer to recognize hand motions and, in some cases, bio-acoustic sounds associated with 25 different hand activities at around 95 percent accuracy. And those 25 activities are just the beginning of what might be possible to detect.

"We envision smartwatches as a unique beachhead on the body for capturing rich, everyday activities," said Chris Harrison, assistant professor in the HCII and director of the Future Interfaces Group. "A wide variety of apps could be made smarter and more context-sensitive if our devices knew the activity of our bodies and hands."

Harrison and HCII Ph.D. student Gierad Laput will present their findings on this new sensing capability at CHI 2019, the Association for Computing Machinery's Conference on Human Factors in Computing Systems, May 4-9 in Glasgow, Scotland. A video is available.

Just as smartphones now can block text messages while a user is driving, future devices that sense hand activity might learn not to interrupt someone while they are doing certain work with their hands, such as chopping vegetables or operating power equipment, Laput said. Sensing hand activity also lends itself to health-related apps -- monitoring activities such as brushing teeth, washing hands or smoking a cigarette.

Hand-sensing also might be used by apps that provide feedback to users who are learning a new skill, such as playing a musical instrument, or undergoing physical rehabilitation. Apps might alert users to typing habits that could lead to repetitive strain injury (RSI), or assess the onset of motor impairments such as those associated with Parkinson's disease.

Laput and Harrison began their exploration of hand activity detection by recruiting 50 people to wear specially programmed smartwatches for almost 1,000 hours while going about their daily activities. Periodically, the watches would record hand motion, hand orientation and bio-acoustic information, and then prompt the wearer to describe the hand activity -- shaving, clapping, scratching, putting on lipstick, etc. More than 80 hand activities were labeled in this way, providing a unique dataset.
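
The release does not describe the team's exact features or model, so the pipeline below is only a hedged sketch of how such labeled recordings might be used: window the accelerometer trace, compute simple per-axis statistics, and fit a generic off-the-shelf classifier on synthetic stand-in data.

```python
# Illustrative pipeline only: window raw accelerometer samples, compute simple
# statistical features, and fit a generic classifier on synthetic stand-in data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(signal: np.ndarray, window: int = 128) -> np.ndarray:
    """Per-window mean, standard deviation and peak magnitude for each of 3 axes."""
    n = (len(signal) // window) * window
    chunks = signal[:n].reshape(-1, window, 3)
    return np.hstack([chunks.mean(axis=1), chunks.std(axis=1), np.abs(chunks).max(axis=1)])

rng = np.random.default_rng(0)
typing = rng.normal(0.0, 0.2, size=(128 * 20, 3))    # stand-in for "typing" recordings
washing = rng.normal(0.0, 1.0, size=(128 * 20, 3))   # stand-in for "washing dishes"
X = np.vstack([window_features(typing), window_features(washing)])
y = np.array(["typing"] * 20 + ["washing dishes"] * 20)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(window_features(rng.normal(0.0, 0.9, size=(128, 3)))))  # likely "washing dishes"
```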

For now, users must wear the smartwatch on their active arm, rather than the passive (non-dominant) arm where people typically wear wristwatches, for the system to work. Future experiments will explore what events can be detected using the passive arm.

"The 25 hand activities we evaluated are a small fraction of the ways we engage our arms and hands in the real world," Laput said. Future work likely will focus on classes of activities -- those associated with specific activities such as smoking cessation, elder care, or typing and RSI.

Credit: 
Carnegie Mellon University

Experimental device generates electricity from the coldness of the universe

image: A drawback of solar panels is that they require sunlight to generate electricity. Some have observed that for a device on Earth facing space, the chilling outflow of energy from the device can be harvested using the same kind of optoelectronic physics we have used to harness solar energy. New work, in Applied Physics Letters, looks to provide a potential path to generating electricity like solar cells but that can power electronics at night. This is a schematic of the experimental infrared photodiode that has generated electricity directly from the coldness of space.

Image: 
Masashi Ono

WASHINGTON, D.C., May 6, 2019 -- The obvious drawback of solar panels is that they require sunlight to generate electricity. Some have observed that for a device on Earth facing space, which has a frigid temperature, the chilling outflow of energy from the device can be harvested using the same kind of optoelectronic physics we have used to harness solar energy. New work, in a recent issue of Applied Physics Letters, from AIP Publishing, looks to provide a potential path to generating electricity like solar cells but that can power electronics at night.

An international team of scientists has demonstrated for the first time that it is possible to generate a measurable amount of electricity in a diode directly from the coldness of the universe. The infrared semiconductor device faces the sky and uses the temperature difference between Earth and space to produce the electricity.

"The vastness of the universe is a thermodynamic resource," said Shanhui Fan, an author on the paper. "In terms of optoelectronic physics, there is really this very beautiful symmetry between harvesting incoming radiation and harvesting outgoing radiation."

In contrast to leveraging incoming energy as a normal solar cell would, the negative illumination effect allows electrical energy to be harvested as heat leaves a surface. Today's technology, though, does not capture energy over these negative temperature differences as efficiently.

By pointing their device toward space, whose temperature approaches mere degrees from absolute zero, the group was able to find a great enough temperature difference to generate power through an early design.

"The amount of power that we can generate with this experiment, at the moment, is far below what the theoretical limit is," said Masashi Ono, another author on the paper.

The group found that their negative illumination diode generated about 64 nanowatts per square meter, a tiny amount of electricity but an important proof of concept that the authors can improve on by enhancing the quantum optoelectronic properties of the materials they use.

Calculations made after the diode created electricity showed that, when atmospheric effects are taken into consideration, the current device can theoretically generate almost 4 watts per square meter, tens of millions of times what the group's device generated and enough to help power machinery that is required to run at night.

By comparison, today's solar panels generate 100 to 200 watts per square meter.
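
Putting the quoted figures side by side (simple arithmetic on the numbers in this article, nothing more):

```python
# Power densities quoted above, all per square meter.
measured = 64e-9                       # 64 nanowatts, measured by the prototype diode
theoretical = 4.0                      # ~4 watts, theoretical estimate with atmospheric effects
solar_low, solar_high = 100.0, 200.0   # typical solar-panel output today

print(f"theoretical / measured = {theoretical / measured:.1e}")  # 6.2e+07, i.e. tens of millions of times
print(f"solar / theoretical = {solar_low / theoretical:.0f}-{solar_high / theoretical:.0f}x")  # 25-50x
```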

While the results show promise for ground-based devices directed to the sky, Fan said the same principle could be used to recover waste heat from machines. For now, he and his group are focusing on improving their device's performance.

Credit: 
American Institute of Physics

Huntington drug successfully lowers levels of disease-causing protein

An international clinical trial has found that a new drug for Huntington disease is safe, and that treatment with the drug successfully lowers levels of the abnormal protein that causes the debilitating disease in patients.

In a study published today in the New England Journal of Medicine, researchers from UBC and their colleagues have demonstrated for the first time that the drug, IONIS-HTTRX (now known as RO7234292), successfully lowered levels of the mutant huntingtin protein--the toxic protein that causes Huntington disease--in the central nervous system of patients.

"This is a tremendously exciting and promising result for patients and families affected by this devastating genetic brain disorder," said Dr. Blair Leavitt, neurologist and director of research at the Centre for Huntington Disease at UBC. "For the first time, we have evidence that a treatment can not only decrease levels of the toxic disease-causing protein in patients, but that it is also safe and very well tolerated."

Leavitt, who is also a senior scientist at the Centre for Molecular Medicine and Therapeutics in the UBC faculty of medicine, treated all the Canadian participants in this study, including the first patient enrolled in the study in September 2015.

Huntington disease (HD) is a fatal genetic neurological disease. It usually develops in adulthood and causes abnormal involuntary movements, psychiatric symptoms and dementia. About one in 10,000 people in Canada have HD. To date, no effective treatments have been proven to slow down progression of this disorder. HD is caused by a single known genetic mutation, and each child of a carrier of the mutation has a 50 per cent chance of inheriting the disease.

The trial enrolled 46 patients with early HD at nine study centres in Canada, the United Kingdom, and Germany. Of the 46 patients, 34 were randomized to receive the drug and 12 were randomized to receive placebo. Each participant received four doses of the drug and all study participants completed the study and have continued to receive the active drug in an on-going open-label study. The drug was administered monthly to patients via an injection directly into the cerebrospinal fluid.

The researchers, led by Dr. Sarah Tabrizi, director of the Huntington Disease Centre at University College London and global chief investigator of the IONIS-HTTRX clinical trial, found that the drug produced significant decreases in the levels of mutant huntingtin protein in the patients' cerebrospinal fluid. None of the patients experienced any serious adverse reactions, suggesting that the treatment is safe and well tolerated by patients.

The drug is currently being evaluated in a large phase three multi-center clinical trial being performed at the Centre for Huntington Disease at UBC and other HD centres around the world. This study is designed to determine whether the treatment slows or halts the progression of disease symptoms.

Credit: 
University of British Columbia

Technology could help reduce exploitation of traditional weavers in Malaysia

image: This is a traditional Songket weaver.

Image: 
Universiti Teknologi MARA

New smartphone apps and greater use of social media could help reduce the exploitation of traditional weavers in poor rural regions of Malaysia, new research suggests.

An interdisciplinary team of researchers, including experts in human-computer interaction, information management, and English and creative writing, studied the supply chain of the songket fabric market in the Malaysian state of Terengganu. The researchers, who are supported by 'Digital Threads: Towards personalized craft production in Malay cottage industries', funded by AHRC UK, believe the use of new, social technology could help weavers connect more directly with customers, reducing the need to deal exclusively with merchants.

Songket is the traditional Malay fabric worn at special occasions, such as weddings and parties. A simple piece of songket can take a skilled weaver a month to make, with more elaborate designs taking much longer.

Many songket weavers work from home in isolated rural villages and they are often commissioned on an exclusive basis by merchants from large towns or cities. These merchants deal directly with the customers and also receive most of the profits from the sale of the garments.

The weavers, who are overwhelmingly women, often have limited education levels, lack ownership of their raw materials or equipment, have limited welfare provision, and are often only paid subsistence wages.

"Our findings indicate that weavers are invisible in both the physical world, due to their remoteness to customers, and the digital world because their relationships with customers are predominantly mediated by their merchants," said Professor Corina Sas, of Lancaster University and co-author of the research. "Weavers have limited awareness of their vulnerable position because of their longstanding relationships with merchants, which for some has been built over several generations".

"Therefore, despite their exploitative nature, these relationships are, in fact, consensual and perceived as beneficial by most weavers."

The researchers, who captured and have written the stories of rural weavers so that these can be shared on online platforms used by weavers and prospective customers, point out that new designs of digital technology could help weavers to transition to selling their wares directly.

"Technological solutions will increase weavers' visibility in the market, and they will learn of the less exploitative transactions available, such as weaving for their own customers," said Dr Min Zhang, of Lancaster University and co-author of the research. "However gaining independence will take time and therefore, to ensure no loss of wages, the new solutions should co-exist, for a while, alongside the current exploitative relationships."

The research, which is to be presented at the prestigious computing academic conference CHI 2019, in Glasgow, highlights the opportunities available for computing experts to design new platforms for a transforming songket supply chain and for its different social layers - which include customers, designers, merchants and weavers.

Credit: 
Lancaster University

Weather extremes explain up to 43% of global crop yield variations

image: Continental regions for staple crops.

Image: 
Supplied by author

Researchers from Australia, Germany and the US have quantified the effect of climate extremes, such as droughts or heatwaves, on the yield variability of staple crops around the world.

Overall, year-to-year changes in climate factors during the growing season of maize, rice, soy and spring wheat accounted for 20%-49% of yield fluctuations, according to research published in Environmental Research Letters.

Climate extremes, such as hot and cold temperature extremes, drought and heavy precipitation, by themselves accounted for 18%-43% of these interannual variations in crop yield.

To get to the bottom of the impacts of climate extremes on agricultural yields, the researchers used a global agricultural database at high spatial resolution, and near-global coverage climate and climate extremes datasets. They applied a machine-learning algorithm, Random Forests, to tease out which climate factors played the greatest role in influencing crop yields.
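
As a hedged sketch of this kind of attribution analysis (the feature names and data below are synthetic placeholders, not the study's global dataset), one can regress yield anomalies on growing-season climate factors with a Random Forest and then inspect how much variance is explained and which factors matter most:

```python
# Illustrative only: fit a Random Forest to synthetic yield anomalies and rank
# climate factors by importance; the study did this with real gridded crop data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
features = ["mean_temp", "temp_extremes", "precip", "drought_index"]
X = rng.normal(size=(5000, len(features)))
# Synthetic anomalies dominated by the temperature terms, echoing the study's finding.
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.3, 5000)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Out-of-sample explained variance is the analogue of the 20%-49% figures above.
print("explained variance:", round(model.score(X_te, y_te), 2))
for name, importance in sorted(zip(features, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {importance:.2f}")
```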

"Interestingly, we found that the most important climate factors for yield anomalies were related to temperature, not precipitation, as one could expect, with the average growing season temperature and temperature extremes playing a dominant role in predicting crop yields," said lead author Dr. Elisabeth Vogel from the Centre of Excellence for Climate Extremes and Climate & Energy College at the University of Melbourne.

The research also revealed global hotspots - areas that account for a large share of the world's crop production, yet are most susceptible to climate variability and extremes.

"We found that most of these hotspots - regions that are critical for overall production and at the same time strongly influenced by climate variability and climate extremes - appear to be in industrialised crop production regions, such as North America and Europe."

For climate extremes specifically, the researchers identified North America for soy and spring wheat production, Europe for spring wheat and Asia for rice and maize production as hotspots.

But, as the researchers point out, global markets are not the only concern. Outside of these major regions, in regions where communities are highly dependent on agriculture for their livelihoods, the failure of these staple crops can be devastating.

"In our study, we found that maize yields in Africa showed one of the strongest relationships with growing season climate variability. In fact, it was the second highest explained variance for crop yields of any crop/continent combination, suggesting that it is highly dependent on climate conditions," Dr Vogel said.

"While Africa's share of global maize production may be small, the largest part of that production goes to human consumption - compared to just 3% in North America - making it critical for food security in the region."

"With climate change predicted to change the variability of climate and increasing the likelihood and severity of climate extremes in most regions, our research highlights the importance of adapting food production to these changes," Dr Vogel said.

"Increasing the resilience to climate extremes requires a concerted effort at local, regional and international levels to reduce negative impacts for farmers and communities depending on agriculture for their living."

Credit: 
University of New South Wales