Tech

Dial-a-frog -- researchers develop the 'FrogPhone' to remotely call frogs in the wild

image: Researchers Anke Maria Hoefer and Adrian Garrido Sanchis test the FrogPhone

Image: 
Marta Yebra Alvarez

Researchers have developed the 'FrogPhone', a novel device that allows scientists to call up a frog survey site and monitor frogs in the wild. The FrogPhone is the world's first solar-powered remote survey device that relays environmental data to the observer via text messages, whilst conducting real-time remote acoustic surveys over the phone. The findings are presented today in the British Ecological Society journal Methods in Ecology and Evolution.

The FrogPhone introduces a new concept that allows researchers to "call" a frog habitat, any time, from anywhere, once the device has been installed. The device has been developed at the University of New South Wales (UNSW) Canberra and the University of Canberra in collaboration with the Australian Capital Territory (ACT) and Region Frogwatch Program and the Australian National University.

The FrogPhone utilises 3G/4G cellular mobile data coverage and capitalises on the characteristic wideband audio of mobile phones, which acts as a carrier for frog calls. Real time frog calls can be transmitted across the 3G/4G network infrastructure, directly to the user's phone. This supports clear sound quality and minimal background noise, allowing users to identify the calls of different frog species.

"We estimate that the device with its current microphone can detect calling frogs from a 100-150m radius" said lead author Dr. Adrian Garrido Sanchis, Associate Lecturer at UNSW Canberra. "The device allows us to monitor the local frog population with more frequency and ease, which is significant as frog species are widely recognised as indicators of environmental health" said the ACT and Region Frogwatch coordinator and co-author, Anke Maria Hoefer.

The FrogPhone unifies both passive acoustic and active monitoring methods, all in a waterproof casing. The system has a large battery capacity coupled to a powerful solar panel. It also contains digital thermal sensors to automatically collect environmental data such as water and air temperature in real-time. The FrogPhone uses an open-source platform which allows any researcher to adapt it to project-specific needs.

The system simulates the main features of a mobile phone. The FrogPhone automatically answers incoming calls after three seconds, a delay that allows time to activate the temperature sensors and measure the battery storage levels. All readings are then automatically texted to the caller's phone.
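That call-handling sequence reads naturally as an event handler. The sketch below is a hypothetical Python simulation of the behaviour described above; every function name is invented for illustration and none of it is the device's actual open-source firmware.

```python
import time

# Hypothetical simulation of the FrogPhone's answer sequence as described
# above. All function names are invented for illustration only.

def read_water_temp_c() -> float:
    return 14.2   # stand-in for a digital thermal sensor reading

def read_air_temp_c() -> float:
    return 11.8   # stand-in for a digital thermal sensor reading

def read_battery_percent() -> float:
    return 87.0   # stand-in for the battery gauge

def send_sms(number: str, payload: dict) -> None:
    print(f"SMS to {number}: {payload}")   # stand-in for the 3G/4G modem

def answer_call() -> None:
    print("Call answered: streaming live audio from the pond...")

def on_incoming_call(caller_id: str) -> None:
    time.sleep(3)  # the ~3 s delay lets the sensors and battery gauge be read
    readings = {
        "water_temp_C": read_water_temp_c(),
        "air_temp_C": read_air_temp_c(),
        "battery_pct": read_battery_percent(),
    }
    send_sms(caller_id, readings)  # readings are texted to the caller
    answer_call()                  # then the live acoustic survey begins

on_incoming_call("+61000000000")
```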

Acoustic monitoring of animals generally involves either site visits by a researcher or battery-powered passive acoustic devices, which record calls and store them locally on the device for later analysis. These surveys often require night-time observation, when frogs are most active. Now, when researchers dial a device remotely, the call to the FrogPhone can itself be recorded and analysed later.

Ms. Hoefer remarked that "The FrogPhone will help to drastically reduce the costs and risks involved in remote or high intensity surveys. Its use will also minimize potential negative impacts of human presence at survey sites. These benefits are magnified with increasing distance to and inaccessibility of a field site."

A successful field trial of the device was performed in Canberra from August 2017 to March 2018. Researchers used spectrograms -- graphs of a signal's frequency content over time -- to test the recording capabilities of the FrogPhone by visually comparing frog calls.
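As a generic illustration of how such a comparison works (not the authors' analysis code), a spectrogram of a recording can be computed with SciPy along these lines:

```python
import numpy as np
from scipy import signal

# Illustrative spectrogram of a synthetic "call": a pulsed 2.5 kHz tone
# standing in for a frog call recorded at 44.1 kHz. Not the authors' code.
fs = 44_100                                # sample rate (Hz)
t = np.arange(0, 2.0, 1 / fs)              # two seconds of audio
call = np.sin(2 * np.pi * 2500 * t) * (signal.square(2 * np.pi * 5 * t) > 0)

# Sxx[i, j] is the power near frequency f[i] at time times[j]; plotting it
# gives the spectrogram used to compare recordings visually.
f, times, Sxx = signal.spectrogram(call, fs, nperseg=1024)
print(f"Dominant band: {f[Sxx.mean(axis=1).argmax()]:.0f} Hz")
```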

Ms. Hoefer commented that "The spectrogram comparison between the FrogPhone and the standard direct mobile phone methodology in the lab, for the calls of 9 different frog species, and the field tests have proven that the FrogPhone can be successfully used as a new alternative to conduct frog call surveys."

The current FrogPhone has some limitations. Its use is restricted to areas with adequate 3G/4G phone coverage; surveying frogs across a large area would require several devices; and the device relies on exposure to sunlight to recharge.

Future additions to the FrogPhone could include a satellite communications module for poor signal areas, or the use of multidirectional microphones for large areas. Lead author Garrido Sanchis emphasized that "In densely vegetated areas the waterproof case of the FrogPhone allows the device to be installed as a floating device in the middle of a pond, to maximise solar access to recharge the batteries".

Dr. Garrido Sanchis said "While initially tested on frogs, the technology used for the FrogPhone could easily be extended to capture other animal vocalisations (e.g. insects and mammals), expanding the applicability to a wide range of biodiversity conservation studies".

Credit: 
British Ecological Society

Infant morbidity decreases with incentive-based prenatal tobacco interventions

Colorado is taking a critical step to protect low-income women during their pregnancy through incentive-based smoking cessation interventions. A new study from the Colorado School of Public Health at the Anschutz Medical Campus shows a significant reduction in infant morbidity due to the program.

The study, published in Public Health Nursing, examines the results of the interventions provided by the Baby & Me Tobacco Free program (BMTF) throughout Colorado.

"Young women, especially when raised in low-income households, are a vulnerable target for tobacco use," said Tessa Crume, PhD, MSPH, associate professor at the Colorado School of Public Health, and lead researcher in the study.

In Colorado, the rate of smoking in the third trimester of pregnancy is three to four times higher among women who live in poverty than among women with higher incomes, according to the Colorado Pregnancy Risk Assessment Monitoring System (2012-2014). Smoking during pregnancy is the most substantial modifiable risk factor for infant morbidity and mortality in the United States.

"The problem of prenatal smoking will not go away, especially when tobacco products target a younger generation, and nicotine addictions begin before becoming pregnant. This study is important because successful interventions improve the health of mothers and children, disrupt familial propagation of tobacco use while also saving Coloradans millions in healthcare costs," Crume said.

The BMTF intervention includes counseling (based on motivational interviewing) provided throughout the pregnancy and postpartum period, biomonitoring feedback via carbon monoxide breath testing, and financial incentives in the form of diaper vouchers contingent on cessation status.

Key findings from the study include:

Reduction in infant morbidity: BMTF participants had a 24% to 28% reduction in the risk of preterm birth and a 24% to 55% reduction in the risk of neonatal intensive care unit (NICU) admissions.

Significant cost savings: Cost savings per BMTF participant were $6,040 compared with the birth certificate reference population and $2,182 compared with the Pregnancy Risk Assessment Monitoring System (PRAMS) reference population.

Total annual cost savings for Colorado associated with the BMTF intervention were $4,144,118 and $1,497,299 compared with the birth certificate and PRAMS reference populations, respectively. Costs for each adverse maternal delivery and birth outcome were based on the average Medicaid reimbursement for a delivery complicated by the outcome, minus the average reimbursement for an uncomplicated delivery.
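As a back-of-envelope consistency check (our arithmetic, not a figure reported by the study), dividing each total by the corresponding per-participant saving implies the size of the analysed cohort:

```python
# Our arithmetic, not numbers reported by the study: total annual savings
# divided by savings per participant implies the analysed participant count.
print(round(4_144_118 / 6_040))  # ~686 (birth certificate reference)
print(round(1_497_299 / 2_182))  # ~686 (PRAMS reference)
```

Both references imply the same cohort of roughly 686 participants, so the two totals are internally consistent.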

Based on an extrapolation estimate: if the BMTF program covered all Colorado Medicaid recipients who smoked in the three months prior to or during pregnancy, the state of Colorado would save an estimated $6 million to $16.8 million annually on healthcare costs associated with adverse smoking-related birth outcomes.

To assess birth outcomes and cost savings of the program, the study compared BMTF participants with two state-level reference populations: a birth certificate reference population of over 16,000 women and a Pregnancy Risk Assessment Monitoring System (PRAMS) reference population representing survey responses from over 16,000 women.

Supporting quotes:

Jill Hunsaker Ryan, executive director, Colorado Department of Public Health and Environment (CDPHE)
"It is very hard to quit smoking. Most women who smoke during pregnancy try to quit, and programs such as Baby and Me Tobacco Free are proven to help them succeed. This translates to healthier moms and babies, saving money on health care for all Coloradans."

Laurie Adams, executive director, Baby & Me Tobacco Free Program

"As we look at the 22 States nationally implementing BMTF, we applaud the excellent work Colorado is doing to reduce the burden of tobacco on the pregnant population. The timely research proves that our efforts are making a difference. The additional component of this published report into cost savings is phenomenal. We not only are helping each woman and baby enrolled in BMTF, we are saving millions in health care dollars in Colorado. As we celebrate the great findings, we hope it will spur greater interest in scale and sustainability efforts in Colorado and in the nation. We congratulate everyone involved in this effort, from the BMTF facilitators in the field, to the evaluators/researchers at UCD, to state leadership at CDPHE, to Rocky Mountain Health Foundation and we look forward to our continued partnership."

Lisa Fenton Free, director, Baby & Me Tobacco Free Program, Rocky Mountain Health Foundation

"Baby & Me Tobacco Free Program continues to prove year after year its ability to improve the lives of mothers, babies, and even family members as well. To prevent preterm deliveries, low birth weights, and improve their overall health is impactful immediately and for years to come."

Credit: 
University of Colorado Anschutz Medical Campus

Electronic map reveals 'rules of the road' in superconductor

image: This band structure map for a single crystal of iron selenide is akin to a road map that describes how traffic rules change for electrons as the material cools and the crystal lattice changes shape, becoming elongated in one direction. The same data are represented in the top and bottom panels. The blue areas (top) show where electrons can travel as they traverse the energy landscape in iron selenide that's been cooled near the point of superconductivity. Paths to the left of center are at right angles to the paths right of center. Thanks to nematicity, the allowable paths for electrons are different in the two directions. Colored lines (bottom) show the paths of electrons in different orbitals. Superconductivity in iron selenide is associated with this "symmetry-broken" state, and mapping the electronic structure of the state could lead to improved theoretical understanding of the phenomenon.

Image: 
Image courtesy of M. Yi/Rice University

HOUSTON -- (Dec. 6, 2019) -- Using a clever technique that causes unruly crystals of iron selenide to snap into alignment, Rice University physicists have drawn a detailed map that reveals the "rules of the road" for electrons both in normal conditions and in the critical moments just before the material transforms into a superconductor.

In a study online this week in the American Physical Society journal Physical Review X (PRX), physicist Ming Yi and colleagues offer up a band structure map for iron selenide, a material that has long puzzled physicists because of its structural simplicity and behavioral complexity. The map, which details the electronic states of the material, is a visual summary of data gathered from measurements of a single crystal of iron selenide as it was cooled to the point of superconductivity.

"It's very much like a traffic map," said Yi, who joined Rice's faculty and Center for Quantum Materials (RCQM) early this year. "In the quantum mechanical world, there are all kinds of interesting rules electrons must obey as they move through a crystal. Some materials have fast lanes and slow lanes, and electrons can only cross and change lanes under special circumstances. Other materials have traffic circles where U-turns are forbidden because electrons can move counterclockwise but not clockwise."

Yi began the angle-resolved photoemission spectroscopy experiments for the study during a postdoctoral stint at the University of California, Berkeley. The technically challenging experiments used powerful synchrotron light from the Stanford Synchrotron Radiation Lightsource (SSRL) to coax the crystal to emit electrons.

"In a sense, these measurements are like taking photographs of electrons that are flying out of the material," she said. "Each photograph tells the lives the electrons were living right before being kicked out of the material by photons. By analyzing all the photos, we can piece together the underlying physics that explains all of their stories."

Red-light cameras for electrons

The electron detector tracked both the speed and direction that electrons were traveling when emitted from the crystal. That information contained important clues about the quantum mechanical laws that dictated the traffic patterns at a larger, microscopic scale, where key aspects of superconductivity are believed to arise.

"The fun aspect of our experimental work is making observations of different types of exotic materials and figuring out the quantum mechanical rules that govern electron behavior in those materials," she said. "Ultimately, we hope those rules will guide physicists to new classes and new kinds of materials, hopefully with amazing properties that improve the world."

These rules are encoded in a material's electronic structure, Yi said.

"They're like an electronic fingerprint of a material," she said. "Each material has its own unique fingerprint, which describes the allowed energy states electrons can occupy based on quantum mechanics. The electronic structure helps us decide, for example, whether something will be a good conductor or a good insulator or a superconductor."

When things go sideways

Electrical resistance is what causes wires, smartphones and computers to heat up during use, and it costs billions of dollars each year in lost power on electric grids and cooling bills for data centers. Superconductivity, the zero-resistance flow of electricity, could eliminate that waste, but physicists have struggled to understand and explain the behavior of unconventional superconductors like iron selenide.

Yi was in graduate school when the first iron-based superconductors were discovered in 2008, and she's spent her career studying them. In each of these, an atom-thick layer of iron is sandwiched between other elements. At room temperature, the atoms in this iron layer are arranged in checkerboard squares. But when the materials are cooled near the point of superconductivity, the iron atoms shift and the squares become rectangular. This change brings about direction-dependent behavior, or nematicity, which is believed to play an important but undetermined role in superconductivity.

"Iron selenide is special because in all of the other iron-based materials, nematicity appears together with magnetic order," Yi said. "If you have two orders forming together, it is very difficult to tell which is more important, and how each one affects superconductivity. In iron selenide, you only have nematicity, so it gives us a unique chance to study how nematicity contributes to superconductivity by itself."

Performing under pressure

The upshot of nematicity is that the traffic patterns of electrons -- and the quantum rules that cause the patterns -- may be quite different for electrons flowing right-to-left, along the long axis of the rectangles, than for the electrons flowing up-and-down along the short axis. But getting a clear look at those traffic patterns in iron selenide has been challenging because of twinning, a property of the crystals that causes the rectangles to randomly change orientation by 90 degrees. Twinning means that long-axis rectangles will run left-to-right about half of the time and up-and-down the other half.

Twinning in iron selenide made it impossible to obtain clear, whole-sample measurements of nematic order in the material until Rice physicists Pengcheng Dai and Tong Chen published a clever solution to the problem in May. Building on a detwinning technique developed by Dai and colleagues in 2014, Chen found he could detwin fragile crystals of iron selenide by gluing them atop a sturdier layer of barium iron arsenide and turning a screw to apply a bit of pressure. The technique causes all the nematic layers in the iron selenide to snap into alignment.

Dai and Chen were co-authors on the PRX paper, and Yi said the detwinning technique was key to getting clear data about the impact of nematicity on iron selenide's electronic behavior.

"This study would not have been possible without the detwinning technique that Pengcheng and Tong developed," Yi said. "It allowed us to take a peek at the arrangements of electronic states as the material system gets ready for superconductivity. We were able to make precise statements about the availability of electrons belonging to different orbitals that could participate in superconductivity when nematic rules have to be obeyed."

A path forward

Yi said the data show that the magnitude of the nematic shifts in iron selenide is comparable to the shifts measured in more complicated iron-based superconductors that also feature magnetic order. She said that suggests the nematicity observed in iron selenide could be a universal feature of all iron-based superconductors, regardless of the presence of long-range magnetism. And she hopes that her data will allow theorists to explore that possibility and others.

"This set of measurements will provide precise guidance for theoretical models that aim to describe the nematic superconducting state in iron-based superconductors," she said. "That's important because nematicity plays a role in bringing about superconductivity in all of these materials."

Credit: 
Rice University

New instrument extends LIGO's reach

image: A close-up of the quantum squeezer that has expanded LIGO's expected detection rate by 50 percent.

Image: 
Maggie Tse

Just a year ago, the National Science Foundation-funded Laser Interferometer Gravitational-wave Observatory, or LIGO, was picking up whispers of gravitational waves every month or so. Now, a new addition to the system is enabling the instruments to detect these ripples in space-time nearly every week.

Since the start of LIGO's third operating run in April, a new instrument known as a quantum vacuum squeezer has helped scientists pick out dozens of gravitational wave signals, including one that appears to have been generated by a binary neutron star merger -- the explosive collision of two neutron stars.

The squeezer, as scientists call it, was designed, built, and integrated with LIGO's detectors by MIT researchers, along with collaborators from Caltech and the Australian National University, who detail its workings in a paper published in the journal Physical Review Letters.

What the instrument "squeezes" is quantum noise -- infinitesimally small fluctuations in the vacuum of space that make it into the detectors. The signals that LIGO detects are so tiny that these otherwise minor quantum fluctuations can have a contaminating effect, potentially muddying or completely masking incoming signals of gravitational waves.

"Where quantum mechanics comes in relates to the fact that LIGO's laser is made of photons," explains lead author Maggie Tse, a graduate student at MIT. "Instead of a continuous stream of laser light, if you look close enough it's actually a noisy parade of individual photons, each under the influence of vacuum fluctuations. Whereas a continuous stream of light would create a constant hum in the detector, the individual photons each arrive at the detector with a little 'pop.'"

"This quantum noise is like a popcorn crackle in the background that creeps into our interferometer, and is very difficult to measure," adds Nergis Mavalvala, the Marble Professor of Astrophysics and associate head of the Department of Physics at MIT.

With the new squeezer technology, LIGO has shaved down this confounding quantum crackle, extending the detectors' range by 15 percent. Combined with an increase in LIGO's laser power, this means the detectors can pick out a gravitational wave generated by a source in the universe out to about 140 megaparsecs, or more than 400 million light years away. This extended range has enabled LIGO to detect gravitational waves on an almost weekly basis.
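A 15 percent gain in range compounds: for sources spread uniformly through space, the volume surveyed -- and with it the expected detection rate -- grows as the cube of the range. The arithmetic below is ours, added for scale, not a figure quoted from the paper.

```latex
% Back-of-envelope scaling (ours, not from the paper): a 15% increase in
% range multiplies the surveyed volume, and hence the expected event rate, by
\[
  \left(\frac{r_{\mathrm{new}}}{r_{\mathrm{old}}}\right)^{3}
  = 1.15^{3} \approx 1.5,
\]
% i.e. roughly 50% more detectable events.
```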

"When the rate of detection goes up, not only do we understand more about the sources we know, because we have more to study, but our potential for discovering unknown things comes in," says Mavalvala, a longtime member of the LIGO scientific team. "We're casting a broader net."

The new paper's lead authors are graduate students Maggie Tse and Haocun Yu, and Lisa Barsotti, a principal research scientist at MIT's Kavli Institute for Astrophysics and Space Research, along with others in the LIGO Scientific Collaboration.

Quantum limit

LIGO comprises two identical detectors, one located at Hanford, Washington, and the other at Livingston, Louisiana. Each detector consists of two 4-kilometer-long tunnels, or arms, extending at right angles from a shared corner in the shape of an "L."

To detect a gravitational wave, scientists send a laser beam from the corner of the L-shaped detector down each arm, at the end of which a mirror is suspended. Each laser bounces off its respective mirror and travels back down the arm to where it started. If a gravitational wave passes through the detector, it should shift one or both of the mirrors' positions, which would in turn affect the timing of each laser's arrival back at its origin. This timing is something scientists can measure to identify a gravitational wave signal.
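For a sense of scale (a standard textbook relation we add here, not spelled out in the article), the mirror displacement produced by a gravitational wave of strain h over an arm of length L is:

```latex
% Textbook relation between strain and arm-length change (background, not
% from the article): a wave of strain h changes one arm's length by
\[
  \Delta L = \tfrac{1}{2}\, h L
  \approx \tfrac{1}{2} \times 10^{-21} \times 4\,\mathrm{km}
  = 2 \times 10^{-18}\,\mathrm{m}
  \qquad (h \sim 10^{-21}),
\]
% about one-thousandth of a proton diameter -- small enough that quantum
% vacuum fluctuations become a limiting noise source.
```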

The main source of uncertainty in LIGO's measurements comes from quantum noise in a laser's surrounding vacuum. While a vacuum is typically thought of as a nothingness, or emptiness in space, physicists understand it as a state in which subatomic particles (in this case, photons) are being constantly created and destroyed, appearing then disappearing so quickly they are extremely difficult to detect. Both the time of arrival (phase) and number (amplitude) of these photons are equally unknown, and equally uncertain, making it difficult for scientists to pick out gravitational-wave signals from the resulting background of quantum noise.

And yet, this quantum crackle is constant, and as LIGO seeks to detect farther, fainter signals, this quantum noise has become more of a limiting factor.

"The measurement we're making is so sensitive that the quantum vacuum matters," Barsotti notes.

Putting the squeeze on "spooky" noise

The research team at MIT began designing a device to squeeze down the uncertainty in quantum noise more than 15 years ago, aiming to reveal fainter and more distant gravitational wave signals that would otherwise be buried in that noise.

Quantum squeezing was first proposed in the 1980s. The general idea is that quantum vacuum noise can be represented as a sphere of uncertainty along two main axes: phase and amplitude. If this sphere is squeezed, like a stress ball, in a way that constricts it along the amplitude axis, this in effect shrinks the uncertainty in the amplitude state of the vacuum (the squeezed part of the stress ball) while increasing the uncertainty in the phase state (the stress ball's displaced, distended portion). Since it is predominantly the phase uncertainty that contributes noise to LIGO, shrinking it can make the detector more sensitive to astrophysical signals.
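In standard quantum-optics notation (background we add here; conventions for the normalisation vary), the stress-ball picture corresponds to an uncertainty product between the amplitude and phase quadratures of the light field:

```latex
% Quadrature uncertainty relation (standard background; one common convention).
% The amplitude (X_1) and phase (X_2) quadratures satisfy
\[
  \Delta X_1\,\Delta X_2 \;\ge\; \tfrac{1}{4},
\]
% which the ordinary vacuum saturates symmetrically
% (\Delta X_1 = \Delta X_2 = 1/2, the round stress ball).
% A squeezed vacuum with squeezing parameter r trades one axis for the other:
\[
  \Delta X_1 = \tfrac{1}{2} e^{-r}, \qquad \Delta X_2 = \tfrac{1}{2} e^{+r},
\]
% shrinking the noise along one axis at the cost of the other.
```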

When the theory was first proposed nearly 40 years ago, a handful of research groups tried to build quantum squeezing instruments in the lab.

"After these first demonstrations, it went quiet," Mavalvala says.

"The challenge with building squeezers is that the squeezed vacuum state is very fragile and delicate," Tse adds. "Getting the squeezed ball, in one piece, from where it is generated to where it is measured is surprisingly hard. Any misstep, and the ball can bounce right back to its unsqueezed state."

Then, around 2002, just as LIGO's detectors first started searching for gravitational waves, researchers at MIT began thinking about quantum squeezing as a way to reduce the noise that could possibly mask an incredibly faint gravitational wave signal. They developed a preliminary design for a vacuum squeezer, which they tested in 2010 at LIGO's Hanford site. The result was encouraging: The instrument managed to boost LIGO's signal-to-noise ratio -- the strength of a promising signal versus the background noise.

Since then, the team, led by Tse and Barsotti, has refined its design, and built and integrated squeezers into both LIGO detectors. The heart of the squeezer is an optical parametric oscillator, or OPO -- a bowtie-shaped device that holds a small crystal within a configuration of mirrors. When the researchers direct a laser beam at the crystal, the crystal's atoms facilitate interactions between the laser and the quantum vacuum in a way that rearranges their properties of phase versus amplitude, creating a new, "squeezed" vacuum that then continues down each of the detector's arms as it normally would. This squeezed vacuum has smaller phase fluctuations than an ordinary vacuum, allowing scientists to better detect gravitational waves.

In addition to increasing LIGO's ability to detect gravitational waves, the new quantum squeezer may also help scientists better extract information about the sources that produce these waves.

"We have this spooky quantum vacuum that we can manipulate without actually violating the laws of nature, and we can then make an improved measurement," Mavalvala says. "It tells us that we can do an end-run around nature sometimes. Not always, but sometimes."

Credit: 
Massachusetts Institute of Technology

Bats may benefit from wildfire

image: A female Yuma myotis is in flight pursuing a moth.

Image: 
Michael Durham/Minden Pictures, Bat Conservation International

Bats face many threats - from habitat loss and climate change to emerging diseases, such as white-nose syndrome. But it appears that wildfire is not among those threats, suggests a study from the University of California, Davis, published today in the journal Scientific Reports. It found that bats in the Sierra Nevada appear to be well-adapted to wildfire.

The researchers used acoustic surveys to test the effects of burn severity and variation in fire effects, or pyrodiversity, on 17 species of bats in the region. Individual species responded to wildfire differently, but overall species richness increased from about eight species in unburned forests to 11 species in forests that experienced moderate- to high-severity burns.

"Bats rely on forests for a number of resources," said lead author Zack Steel, a postdoctoral researcher with UC Davis and UC Berkeley who conducted the study as a UC Davis doctoral candidate. "The key is recognizing that natural fire is useful to them because it creates a variety of habitat conditions. They are adapted to it. Many species seem to actually benefit from fire."

TOO DENSE

Many forest bats are adapted to dense spaces, while others are associated with open habitats. Researchers were surprised to find species from both groups preferred burned forests to unburned or minimally burned forests. Steel thinks this is because decades of fire suppression created uncharacteristically dense forests.

"Our forests are now so dense that even clutter-tolerant bats are preferring burned areas," he said. "There are big areas of forests that haven't seen fire in a century. When fires do occur, they create openings for these species."

These openings are entryways for bats to better find insects to eat. Dead trees or snags even provide roosting habitat for some bats.

The study lends support to the practice of prescribed burns and managed wildfire, in which lightning-caused fires are allowed to burn in remote areas of Sierra Nevada forests.

SOUND ACOUSTICS

As nocturnal animals with no audible call or song, bats are relatively understudied. They use echolocation, in which they emit a sound humans cannot hear. They listen as the call bounces off an object, marking its location.

To learn how wildfire is affecting bat habitat, the research team used acoustic surveying technology with ultrasonic microphones to track echolocation patterns, which vary among species. The recordings were converted into spectrograms, or visualizations of bat calls, that allowed scientists to identify the species present. Then they compared bat occurrence rates to habitat conditions.
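At its core, the richness comparison is a matter of counting distinct species per burn-severity class once each recording has been identified. Below is a minimal sketch of that aggregation step, with invented example data (the study's actual occupancy modelling is more sophisticated):

```python
# Minimal sketch of a species-richness comparison across burn severities.
# The detections below are invented example data, not the study's records.
site_detections = {
    ("site_1", "unburned"): {"Myotis yumanensis", "Eptesicus fuscus"},
    ("site_2", "moderate_high"): {"Myotis yumanensis", "Lasiurus cinereus",
                                  "Tadarida brasiliensis"},
}

richness: dict[str, set[str]] = {}
for (site, severity), species in site_detections.items():
    richness.setdefault(severity, set()).update(species)

for severity, species in sorted(richness.items()):
    print(f"{severity}: {len(species)} species")
```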

The study areas included California forests in the Sierra Nevada Mountains affected by the 2013 Rim Fire, 2004 Power Fire and 2012 Chips Fire.

Steel said that with recent dramatic changes in wildfire patterns in the Sierra Nevada, shifts in the composition of species are likely underway, but "not everything is losing."

Credit: 
University of California - Davis

Study finds little increased risk of injury in high-intensity functional training program

ROCHESTER, Minn. -- High-intensity group workout classes are increasingly popular at fitness centers. While research has shown that these workouts can have cardiovascular and other benefits, few studies have been conducted on whether they lead to more injuries.

A Mayo Clinic study that closely tracked 100 participants in a six-week high-intensity functional training program showed a statistically insignificant increase in the rate of injury, compared with less intensive workouts.

The study, published in Mayo Clinic Proceedings, reported an injury rate of 9 injuries per 1,000 training hours during the six-week training, compared with 5 injuries per 1,000 training hours during the six weeks preceding enrollment. The data showed that 18% of participants reported an injury during the training period, and 37.5% of those injuries occurred during a training session.
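The rate metric itself is simple to compute. In the sketch below the exposure hours are invented purely to show the arithmetic, since the press release reports only the resulting rates:

```python
# Injuries per 1,000 training hours. The hour total here is invented to
# illustrate the formula; the study reports only the resulting rates.
def rate_per_1000_hours(injuries: int, training_hours: float) -> float:
    return 1000 * injuries / training_hours

# e.g., 100 participants x 6 weekly one-hour classes = 600 class hours
print(round(rate_per_1000_hours(injuries=5, training_hours=600), 1))  # 8.3
```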

"These types of classes, which can include ballistic movements such as throwing or jumping with weights, and resistance training with kettlebells or free weights, have become very popular, but other than studies of similar programs in military training, there are no prospective research studies on injuries that can occur in these classes," says Edward Laskowski, M.D., co-director of Mayo Clinic Sports Medicine. "Our findings show a trend toward an increase in injury during the course of a typical class."

"Emphasizing proper technique and movement patterns is very important in all exercise, especially strength training," says Dr. Laskowski, the study's corresponding author. Most injuries in this study were related to movements that are ballistic or have an increased risk of injury if not performed with optimal technique.

"Though not statistically significant, the injury rate was almost three times the rate reported in previous studies," he says. "Hopefully, these results will provide a stimulus for interventions, including focusing on optimal technique and eliminating exercises that are higher-risk when not performed correctly, that will help reduce the risk of injury."

The study also emphasizes the importance of participants informing the trainer of preexisting injuries or medical conditions, monitoring fatigue during the workout, and modifying or eliminating exercises that put an individual at risk.

The research involved 100 adults, 82% of whom were female, who participated in a high-intensity functional training class at Mayo Clinic's Dan Abraham Healthy Living Center from January 2017 to April 2018. One-hour group workout classes were held weekly for six weeks, and participants completed a survey before and after the classes ended. Participants had the advantage of a small instructor-to-participant ratio, which allowed for closer monitoring of technique and movement.

Injuries were self-reported, and the most common injuries were to the back and knees. Burpees and squats were the most common movements causing injury, according to the study.

The growing popularity of high-intensity group workout programs should lead to more study of benefits and risks, Dr. Laskowski says, with the goal of further reducing the risk of injury. "The United States is in the midst of an epidemic of obesity and sedentary lifestyle. Programs that promote physical activity, such as high-intensity functional training, can help to mitigate the effects of this epidemic and provide the motivation needed to get people moving."

Credit: 
Mayo Clinic

Recruitment of miR-8080 by luteolin inhibits AR-V7 in castration-resistant prostate cancer

image: MiR-8080 recruited by luteolin enhances the chemotherapeutic effect of enzalutamide in 22Rv1 CRPC tumor (A) MiR-8080 can bind the 3'-untranslated region of AR-V7. (B-D) Effect of luteolin (Lut) on the chemotherapeutic efficacy of enzalutamide (Enz) in 22Rv1 xenografts in castrated nude mice. (B) Tumor volumes of 22Rv1 xenografts. (C) Quantitative miR-8080 expression by qRT-PCR in 22Rv1 xenografts. (D) Western blotting analysis for AR-V7 in 22Rv1 xenografts.

Image: 
© Aya Naiki-Ito

Prostate cancer is the most common noncutaneous malignancy in the United States and a leading cause of cancer death in men. The development of prostate carcinogenesis is initially androgen-dependent.

However, the progression of castration-resistant prostate cancer (CRPC) following androgen deprivation therapy is a major clinical problem. Although enzalutamide and abiraterone have been approved for CRPC hormone therapy, the efficacy of these drugs is limited. Androgen receptor splice variant 7 (AR-V7), which lacks a functional ligand-binding domain, stands out as a major contributor to cell proliferation and therapeutic resistance in CRPC.

In the present study, Dr. Aya Naiki-Ito (Associate Professor, Nagoya City University), Dr. Satoru Takahashi (Professor, Nagoya City University) and their collaborators investigated the chemopreventive and chemotherapeutic potential of luteolin, a flavonoid with anti-oxidative properties, on prostate cancer, including CRPC. Luteolin inhibited the progression of rat prostate carcinogenesis by inducing apoptosis in a transgenic rat for adenocarcinoma of prostate (TRAP) model. Luteolin decreased cell proliferation in a dose-dependent manner and induced apoptosis with the activation of caspases 3 and 7 in both rat (PCai1) and human (22Rv1) CRPC cells. Dietary luteolin also suppressed tumor growth via an increase in apoptosis and inhibition of angiogenesis in PCai1 and 22Rv1 xenografts implanted in castrated nude mice. Luteolin dramatically suppressed AR-V7 protein expression in 22Rv1 cells in vitro and ex vivo. Microarray analysis identified miR-8080, which has a potential binding site in the AR-V7 3'-untranslated region, as a gene up-regulated by luteolin. MiR-8080 transfection decreased AR-V7 expression and induced apoptosis in 22Rv1 cells. Furthermore, miR-8080 knock-down abolished the luteolin-induced decrease in AR-V7 expression and in 22Rv1 cell growth.

Finally, the researchers confirmed the effect of luteolin on the chemotherapeutic efficacy of enzalutamide against CRPC in 22Rv1 xenografts in castrated nude mice. Enzalutamide-only treatment did not affect the tumor growth of 22Rv1. However, miR-8080 induced by luteolin intake down-regulated AR-V7 and greatly enhanced the therapeutic effect of enzalutamide on 22Rv1 tumors.

In conclusion, luteolin suppresses both the early stage of prostate carcinogenesis and CRPC via the induction of apoptosis. MiR-8080, recruited by luteolin supplementation, plays an important role in reducing AR-V7 protein, thereby inhibiting tumorigenesis and countering the enzalutamide resistance of CRPC. MiR-8080 may therefore be a novel therapeutic target for CRPC.

Credit: 
Nagoya City University

Multiple correlations between brain complexity and locomotion pattern in vertebrates

image: High-definition 3D reconstructions of whole-brains in the facultative bipedal brown basilisk lizard (Basiliscus vittatus, left panel) and golden flying snake (Chrysopelea ornata, right panel), highlighting the morphological variation in cerebellar architecture associated with locomotor specialization.

Image: 
Simone Macrì and Nicolas Di-Poï, University of Helsinki.

Researchers at the Institute of Biotechnology, University of Helsinki, have uncovered multi-level relationships between locomotion - the ways animals move - and brain architecture, using high-definition 3D models of lizard and snake brains.

The new study unveils the existence of multiple correlations between brain complexity and locomotion pattern in vertebrates, indicating that locomotion mode is a strong predictor of cerebellar size, shape, neuron organization, and gene expression levels. This demonstrates the existence of a specific type of brain shared by animals with similar lifestyles or behaviors.

"The cerebellum is a major component of the brain that contributes to coordination, precision, and accurate timing of movement, and the diversity of this brain region is remarkable across vertebrates", describes Principal Investigator Nicolas Di-Poï, Associate Professor at the Institute of Biotechnology, University of Helsinki.

Research studies have previously shown that behavioral and ecological factors such as diet, habitat, locomotion, cognitive abilities and lifespan play an important role in driving animal brain evolution. However, comparative studies have so far largely focused on brain size measurements, and the ecological relevance of potential multi-level variations in brain morphology and architecture had remained unclear until now.

Researchers from the University of Helsinki hypothesized that in addition to expected morphological changes in limb and skeletal structures, the ways animals move from one place to another could be a strong predictor of brain complexity at various levels of biological organization, including size, shape, neuron organization and gene expression pattern.

Based on contrast-enhanced computed tomography technology and high-resolution manual segmentation, "we present here one of the first sets of high-definition 3D reconstructions of whole-brains in vertebrates", says the first author of the study, PhD candidate Simone Macrì from the University of Helsinki.

To test this hypothesis, the research group used squamate reptiles - lizards and snakes - as the main animal model because of their high levels of morphological diversity and unique behavioral features. One major challenge the group faced was to collect a representative panel of more than 100 reptile specimens with different locomotor modes, ranging from small worm-like limbless species that dig and live underground to four-limbed species with facultative bipedal or flying capabilities. This effort involved active collaborations with museums, private breeders and other collaborators.

Credit: 
University of Helsinki

Scientists identify new marker for insecticide resistance in malaria mosquitoes

image: Anopheles gambiae mosquito showing Gal4 directed ubiquitous expression of red fluorescent protein

Image: 
Adriana Adolfi

Researchers at LSTM have genetically modified malaria-carrying mosquitoes in order to demonstrate the role of particular genes in conferring insecticide resistance.

For the first time, the team characterised three genes (Cyp6m2, Cyp6p3 and Gste2) most often associated with insecticide resistance directly, by overproducing them in genetically modified Anopheles gambiae. LSTM's Dr Gareth Lycett is senior author on a paper published today in the journal PNAS. He explained: "Malaria is resurging again in Africa as mosquitoes become highly resistant to the insecticides that are used to treat bednets and spray household surfaces."

To help find the causes of this increased resistance, first author Dr Adriana Adolfi and her colleagues at LSTM generated genetically modified mosquitoes that overproduce specific enzymes that previous work at LSTM had identified as potential candidates in the acquisition of insecticide resistance. This breakthrough work found that increased production of just these three genes can, between them, make the mosquitoes resistant to all four classes of public health insecticides currently used in malaria control.

Dr Lycett continued: "These data validate the particular genes as excellent markers for resistance, giving us much needed tools to monitor the growing problem effectively through molecular testing. The super-resistant mosquitoes generated are now also being used to test new insecticides, to find compounds that escape the activity of these enzymes and so can successfully be incorporated into next-generation bednets to again provide effective protection for users and the wider community."

Credit: 
Liverpool School of Tropical Medicine

Graphene takes off in composites for planes and cars

The Graphene Flagship brought together top European researchers and companies to discuss the most disruptive ways graphene could enhance composites used in the aerospace, automotive and energy industries. The multidisciplinary team involved researchers from academic institutions, business enterprises such as Graphene Flagship Partners Nanesa and Avanzare, and large transportation end-user industries, such as Graphene Flagship Partners Airbus and Fiat. They showed that integrating graphene and related materials (GRMs) into fibre-reinforced composites (FRCs) has great potential to reduce weight and improve strength, and helps to overcome the bottlenecks limiting the applications of these composites in planes, cars, wind turbines and more.

Nowadays, the transportation industry is responsible for nearly one-third of global energy demand, and it is the major source of pollution and greenhouse gas emissions in urban areas. Graphene Flagship scientists are therefore continually trying to develop new materials to lower fuel usage and CO2 emissions, helping to mitigate environmental damage and climate change.

Graphene-integrated composites are an example of lighter materials with great potential for use in vehicle frameworks. They are constructed by introducing graphene sheets, a few billionths of a metre thick, into hierarchical fibre composites as nano-additives. Hierarchical fibre composites are a type of composite material in which components of different sizes are combined in a controlled way to significantly improve the mechanical properties. They typically consist of micro- or mesoscopic carbon fibres, a few millionths of a metre thick, attached to a polymer matrix, and they are already used as building materials to make vehicles of all shapes and sizes.

Graphene's high aspect ratio, high flexibility and mechanical strength enable it to enhance the strength of weak points in these composites, such as at the interface between two different components. Its tunable surface chemistry also means that interactions with the carbon fibre and polymer matrix can be adjusted as needed. The fibre, polymer matrix and graphene layers all work together to distribute mechanical stress, resulting in a material with improved strength and other beneficial properties.

There are many challenges to consider. For instance, planes experience temperature changes between 20 °C and -40 °C every time they take off and land, with huge differences in pressure and humidity. Graphene-integrated composites therefore need to withstand water condensing and even freezing inside the fuselage. They also need to endure lightning strikes, which happen several times per month, so the conductive properties of graphene must be harnessed to create an electrically conductive framework that resists electromagnetic impulses. In cars, new structural materials must be able to withstand crash tests and be lightweight enough to ensure fuel efficiency. Graphene Flagship researchers are also investigating conductive materials to replace circuitry in car dashboards.

Researchers and end-users come together

Graphene Flagship partners at Queen Mary University and the National Graphene Institute, UK, FORTH-Hellas, Greece, CNR, Italy, and Chalmers University of Technology, Sweden, collaborated with researchers at the University of Turin, the University of Trento and KET-LAB, Italy, and the University of Patras, Greece, to provide perspectives from the research community. They worked with scientists at Graphene Flagship partner companies Nanesa, Italy, and Avanzare, Spain, to review the technological viability of graphene-incorporated FRCs.

Francesco Bertocchi, co-author of the paper and President of Nanesa, believes that graphene-incorporated FRCs are indeed feasible for vehicle design, and has created new composites with many essential properties for the transportation industries. "Thanks to the Graphene Flagship, Nanesa has worked in close synergy with many partners to create many different prototypes. These include properties such as flame retardancy, a water vapor absorption barrier, high electrical and thermal conductivity, and EMI shielding. We also integrated thermo-resistive systems for de-icing and anti-icing," he says.

Graphene Flagship Partners Airbus and Fiat-Chrysler Automobiles, world-leading aerospace and automotive companies, evaluated the impact of graphene-incorporated FRCs on the aerospace and automotive industries and assessed their commercial viability.

Tamara Blanco-Varela, co-author and materials & processes engineer at Airbus, explains that Airbus is working hard to make these materials viable for use in new aircraft models. "We all know that the aeronautical sector is very challenging for the introduction of new materials or technologies. Airbus is committed to making graphene-related materials fly as soon as possible, and a step-by-step approach is being set up," she says. By selecting 'quick-win' applications with immediate benefits to the aerospace industry, she anticipates that graphene-integrated FRCs will reach the market soon.

"One example is using these materials for anti- and de-icing purposes in aeroplanes, for which Airbus will be leading activities targeting commercial exploitation of this technology. We are hoping for it to reach a high maturity level, with a target readiness level between five and six, in the next few years."

Brunetto Martorana, co-author and researcher at Graphene Flagship partner Fiat-Chrysler Automobiles, adds: "The interesting structural properties of graphene have opened an interesting window for designing novel light composites." He explains that new lightweight composite materials do not necessarily need to be lower in strength and introduce safety issues. "New approaches must be found to enhance the 'crashworthiness' of composites - and graphene composites may be able to fill that role," he continues. Fiat-Chrysler Automobiles has now committed to the commercialization of new composite materials, and will be leading a new initiative to bring this technology to market.

An uplifting outlook

"The Graphene Flagship provides a stable, clear, long-lasting partnership for different partners to work together. They all started their collaboration as part of our Composites Work Package", comments Vincenzo Palermo, Graphene Flagship Vice-Director and lead author of the paper. "The Graphene Flagship pushes all partners to have frequent interactions, with regular meetings - like in this case, partners who begun working on graphene with different motivations have come together to address common challenges," he says.

Costas Galiotis, the Graphene Flagship's Composites Work Package leader, says this collaboration has been highly valuable. "This is a comprehensive review of the work undertaken in the Graphene Flagship, and elsewhere, to confirm that the addition of GRMs provides benefits to many applications in the aerospace, automotive, energy and leisure industries." Galiotis expresses particular interest in the review's analysis of the best ways to process GRMs into composites, the effect of this on the overall composite performance, and the challenges scientists face in the search for high-performance composites. "Overall, I think this is a timely review article for the composites field, which should be read with interest by all parties involved with composite development and usage," he concludes.

Andrea C. Ferrari, Science and Technology Officer of the Graphene Flagship and Chair of its Management Panel, comments: "This paper shows the leadership of large corporations and small enterprises, all partners of the Graphene Flagship, in taking graphene composites to the market in the next few years. This yet again shows the steady progress of the Graphene Flagship along its technology and innovation roadmap."

Credit: 
Graphene Flagship

Scientists see defects in potential new semiconductor

COLUMBUS, Ohio-- A research team has reported seeing, for the first time, atomic scale defects that dictate the properties of a new and powerful semiconductor.

The study, published earlier this month in the journal Physical Review X, shows a fundamental aspect of how the semiconductor, beta gallium oxide, controls electricity.

"Our job is to try to identify why this material, called beta gallium oxide, acts the way it acts at the fundamental level," said Jared Johnson, lead author of the study and a graduate research associate at The Ohio State University Center for Electron Microscopy and Analysis. "It is important to know why this material has the properties it has, and how it acts as a semiconductor, and we wanted to look at it at the atomic level -- to see what we could learn."

Scientists have known about beta gallium oxide for about 50 years, but only in the last several years has it become an intriguing option for engineers looking to build more reliable, more efficient high-powered technologies. The material is especially well-suited for devices used in extreme conditions, such as in the defense industry. The team has been studying beta gallium oxide for its potential to provide high-density power.

For this study, the CEMAS team, overseen by Jinwoo Hwang, assistant professor of materials science and engineering, examined beta gallium oxide under a powerful electron microscope to see the way the material's atoms interacted. What they saw confirmed a prediction theorists first made about a decade ago: beta gallium oxide has a form of imperfection in its structure, something the team refers to as "point defects," which are unlike any defects previously seen in other materials.

Those defects matter: For example, they could be places where electricity could be lost in transit among electrons. With proper manipulation, the defects can also provide opportunities for unprecedented control of the material's properties. But understanding the defects must come before we learn how to control them.

"It is very meaningful that we could actually directly observe these point defects, these abnormalities within the crystal lattice," Johnson said. "And these point defects, these oddballs within the lattice structure, lower the energy stability of the structure."

A lower energy stability means that the material might have some flaws that need addressing in order to conduct electricity efficiently, Johnson said, but that does not necessarily mean beta gallium oxide would be a poor semiconductor. The defects can in fact behave favorably to conduct electricity - if scientists can control them.

"This material has very good properties for those high-powered technologies," he said. "But it is important that we're seeing this on the fundamental level -- we're almost understanding the science behind this material and how it works, because this defect, these abnormalities, could affect the way it functions as a semiconductor."

Credit: 
Ohio State University

Between excitation and inhibition

image: Granule cells (blue) process and encode information and assemble it into a kind of map in the dentate gyrus.

Image: 
Nature Communications

The dentate gyrus is the "input point" for the hippocampus, the part of the brain that transfers information from short-term to long-term memory. It consists of granule cells, which are especially dense in this area of the brain, and interneurons, which connect several nerve cells within the central or peripheral nervous system and inhibit their activity. Both types of cell process information and differentiate closely related memories. A team headed by Prof. Dr. Marlene Bartos from the Institute of Physiology I of the University of Freiburg, which also includes lead author Dr. Claudio Elgueta, has found out why granule cells and interneurons process incoming signals differently: they have fundamentally different structures and functional characteristics. The work group has published its results in the journal Nature Communications.

The dendritic appendages of the nerve cells receive incoming signals much like antennae. Interneurons can be strongly inhibited because chloride transporters enhance the inhibitory signals and GABAA receptors are present at high density. These interneurons do not directly process information but determine which granule cells are involved in information processing. Granule cells, on the other hand, have lower densities of GABAA receptors and are only mildly inhibited: they process and encode signals from the environment, which results in a map-like representation of the environment in the dentate gyrus.

If the degree of inhibition in the brain cells changes, malfunctions can arise in the processing, enciphering and accessing of information. This can impair memory function and lead to neurological disorders. These results are a contribution to a better understanding of the mechanisms of information processing in the central nervous system.

Since 2018 Marlene Bartos and her team have received an Advanced Grant of 2.5 million euros from the European Research Council (ERC).

Credit: 
University of Freiburg

Squid pigments have antimicrobial properties

image: Diagram with the stages of the study. The main ommochrome identified in squid skin is xanthommatin.

Image: 
Chan-Higuera et al.

Ommochromes, the pigments that colour the skin of squids and other invertebrates, could be used in the food and health sectors for their antioxidant and antimicrobial properties. This is confirmed by the analyses carried out by researchers from the University of Sonora in Mexico and the Miguel Hernández University in Spain.

One of the squid's best-known characteristics is its ability to change its colour to blend in with the environment, which it does by contracting or relaxing skin cells known as chromatophores.

Now, scientists at the Miguel Hernández University in Elche (Spain) and the University of Sonora in Mexico have discovered that these cells contain a type of violet pigment, called ommochrome, which has antimicrobial, antioxidant and antimutagenic properties.

"When we add extracts of these squid pigments to bacterial cultures like Listeria monocytogenes, Salmonella enterica, Staphylococcus aureus or Haemophilus influenzae, or to fungi like Candida albicans, the growth of microorganisms is inhibited," says Jesús Enrique Chan, a researcher at the two universities and co-author of the study.

"Moreover," he adds, "ommochromes act as antioxidants, binding or 'sequestering' metals and eliminating such radicals as an active oxygen called singlet and superoxide anions. We've identified the best temperature, time and proportion conditions with solvents so as to obtain its greater antioxidant and antimicrobial activity, as well as its antimutagenic capacity against agents such as aflatoxin B1 (a mutagenic mycotoxin)."

The study, published in the Journal of Microbiology, Biotechnology and Food Sciences, concludes that cephalopod ommochromes are the components responsible for all these beneficial properties and highlights: "They are pigments with a promising therapeutic value, which could be applied in future in both the food and health sectors."

In order to carry out their research, the authors have used samples of the giant squid (Dosidicus gigas), a species that is caught on the Pacific coast of America and which is mainly used for its mantle and tentacles. The rest, including its skin, is regarded as fishing waste and dumped into the sea.

"This generates pollution problems in the coasts," says Chan, "so research like this, in which we inform about how these wastes could be used, helps to revalue them and minimise their dumping into the environment."

The main ommochrome identified by the scientists is xanthommatin, a pigment also found in other squid (Loligo vulgaris, Doryteuthis pealeii...), in cuttlefish (Sepia officinalis) and octopus (Octopus vulgaris), and in other invertebrate species, from which this beneficial compound could also be extracted.

Credit: 
Spanish Foundation for Science and Technology

The Lancet: First long-term estimates suggest link between cholesterol levels and risk of heart disease and stroke

Study is the most comprehensive analysis of long-term risk for cardiovascular disease related to non-high-density lipoprotein (non-HDL) cholesterol - including almost 400,000 people from 19 countries who were followed for up to 43.5 years (median follow-up 13.5 years) between 1970 and 2013.

This longer-term evidence may be particularly important in people aged under 45 years.

Depending on cholesterol level and number of cardiovascular risk factors, men and women aged under 45 years have a 12-43% and 6-24% risk, respectively, of fatal or non-fatal heart disease or stroke by the age of 75 years.

If non-HDL cholesterol levels were halved, women and men younger than 45 years with starting levels of non-HDL cholesterol between 3.7-4.8 mmol/litre and who had two additional cardiovascular risk factors could reduce their risk from around 16% to 4%, and from around 29% to 6%, respectively.

The most comprehensive analysis of its kind suggests that there is a strong link between non-HDL cholesterol levels and long-term risk for cardiovascular disease in people aged under 45 years, not just at older ages.

The observational and modelling study, which used individual-level data from almost 400,000 people and is published in The Lancet, extends existing research by suggesting that increasing levels of non-HDL cholesterol may predict long-term cardiovascular risk by the age of 75 years. Past risk estimates of this kind are based on 10-year follow-up data.

For example, women with non-HDL cholesterol levels between 3.7-4.8 mmol/litre, who were younger than 45 years, and had at least two additional cardiovascular risk factors, had a 16% probability of experiencing a cardiovascular disease event by the age of 75 years (ie, 16 in 100 women with these characteristics were predicted to have a cardiovascular event by the age of 75 years). For women aged 60 or over with the same characteristics, the estimated risk was 12%.

For men with the same characteristics, the estimated risk for those aged under 45 years was 29%, and was 21% for those aged 60 years or more.

"This increased risk in younger people could be due to the longer exposure to harmful lipids in the blood. The risk may also appear larger compared to older ages because people aged 60 years and older in our study had not developed cardiovascular disease up to this age, so they may be healthier than others of their age who were excluded from the study because they had had cardiovascular disease," says Professor Barbara Thorand, German Research Center for Environmental Health, Germany. [1]

The amounts of non-HDL cholesterol [2] and low-density lipoprotein (LDL) cholesterol in the blood are accepted as causal risk factors for cardiovascular disease, and play a significant part in predicting a person's risk of developing it.

The authors say that intervening early and intensively to reduce non-HDL cholesterol levels during the lifespan could potentially reverse early signs of atherosclerosis. However, considerable uncertainty exists about the extent to which slightly increased or apparently normal cholesterol levels affect lifetime cardiovascular risk, and about which levels should be used to make treatment recommendations, particularly in young people.

In the study, the authors used individual-level data from almost 400,000 people from 38 studies from Europe, Australia and North America. The participants had no cardiovascular disease at the start of the study and were followed for up to 43.5 years (median 13.5 years follow-up) for the occurrence of a fatal or non-fatal coronary heart disease event or ischaemic stroke.

Using their data, the authors assessed and confirmed the long-term association between cholesterol levels and cardiovascular event risk. They then used this data in a model to estimate the probability of a cardiovascular event by the age of 75 years for people aged 35-70 years, according to a person's gender, non-HDL cholesterol levels, age, and cardiovascular disease risk factors (such as smoking status, diabetes, BMI, systolic blood pressure, and antihypertensive medication). The model also estimated how much risk could be reduced if non-HDL cholesterol levels were halved (the authors note that the 50% reduction was hypothetical and not based on specific estimates or treatments).
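
The release does not spell out the model itself, so purely as a hypothetical illustration of how a risk model of this general shape turns risk factors into a probability of an event by age 75, here is a minimal Python sketch using a Cox-style formula. Every coefficient and the baseline survival below are invented placeholders, not estimates from the study:

import math

# Hypothetical illustration only: neither these coefficients nor the baseline
# survival come from the Lancet study; they are placeholders chosen to show
# the shape of a proportional-hazards lifetime-risk calculation.
BETA = {
    "non_hdl_mmol_l": 0.25,   # assumed log-hazard ratio per mmol/L of non-HDL cholesterol
    "smoker": 0.55,           # assumed log-hazard ratio for current smoking (0/1)
    "diabetes": 0.65,         # assumed log-hazard ratio for diabetes (0/1)
    "sbp_per_10mmhg": 0.10,   # assumed log-hazard ratio per 10 mmHg systolic BP above 120
}
BASELINE_SURVIVAL_TO_75 = 0.92  # assumed event-free probability for the reference profile

def risk_by_age_75(non_hdl, smoker, diabetes, sbp):
    """Probability of a cardiovascular event by age 75: 1 - S0 ** exp(beta . x)."""
    lin = (
        BETA["non_hdl_mmol_l"] * (non_hdl - 2.6)          # centred at the lowest category
        + BETA["smoker"] * smoker
        + BETA["diabetes"] * diabetes
        + BETA["sbp_per_10mmhg"] * (sbp - 120) / 10.0
    )
    return 1.0 - BASELINE_SURVIVAL_TO_75 ** math.exp(lin)

# The same hypothetical person before and after halving non-HDL cholesterol:
before = risk_by_age_75(non_hdl=4.2, smoker=1, diabetes=1, sbp=140)
after = risk_by_age_75(non_hdl=2.1, smoker=1, diabetes=1, sbp=140)
print(f"estimated risk: {before:.1%} -> {after:.1%}")

The point of the sketch is the structure, not the numbers: a lower non-HDL value shrinks the linear predictor and therefore the estimated lifetime risk.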

During follow-up, there were 54,542 fatal or non-fatal cases of heart disease and stroke.

Looking at data for all age groups and both sexes, the authors found that the risk of a cardiovascular event fell continuously with decreasing non-HDL cholesterol levels and was lowest in individuals in the lowest non-HDL category (below 2.6 mmol of non-HDL cholesterol per litre in the study). [3]

Using the model to estimate the risk of a cardiovascular event by the age of 75 years for different age groups, the authors found that the highest long-term risks of cardiovascular disease were seen in individuals younger than 45 years of age.

The example quoted earlier illustrates this: women younger than 45 years with non-HDL cholesterol levels between 3.7-4.8 mmol/litre and at least two additional cardiovascular risk factors had an estimated 16% probability of experiencing a cardiovascular disease event by the age of 75 years, compared with 12% for women aged 60 or over with the same characteristics; for men, the corresponding estimates were 29% and 21%.

Using the model to estimate how much cardiovascular risk could be reduced if a person halved their non-HDL cholesterol level, the authors found that, at every non-HDL cholesterol level, the greatest reductions were seen in the youngest age group.

For example, in people younger than 45 years with levels of 3.7-4.8 mmol/litre and with at least two risk factors, they estimated that the long-term risk of cardiovascular disease could hypothetically be reduced from 16% to 4% in women, and from 29% to 6% in men. For people with the same characteristics aged 60 years or over, risk could potentially be reduced from 12% to 6% in women and from 21% to 10% in men.
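
The absolute and relative risk reductions implied by these reported figures can be checked with a few lines of Python; only the percentages quoted above are used, nothing here comes from the underlying study data:

# Risk reductions implied by the quoted figures for the under-45 group
# (non-HDL 3.7-4.8 mmol/L, at least two additional risk factors).
for sex, before, after in [("women", 0.16, 0.04), ("men", 0.29, 0.06)]:
    arr = before - after          # absolute risk reduction
    rrr = arr / before            # relative risk reduction
    print(f"{sex}: ARR = {arr:.0%}, RRR = {rrr:.0%}")
# women: ARR = 12%, RRR = 75%
# men: ARR = 23%, RRR = 79%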

"Our estimates suggest that halving non-HDL cholesterol levels may be associated with reduced risk of cardiovascular events by the age of 75 years, and that this reduction in risk is larger the sooner cholesterol levels are reduced. The risk scores currently used in the clinic to decide whether a person should have lipid-lowering treatment only assess the risk of cardiovascular disease over 10 years, and so may underestimate lifetime risk, particularly in young people," says Professor Stefan Blankenberg, German Center for Cardiovascular Research, Germany. [1]

"In lieu of needed clinical trial results investigating the benefits of long-term lipid-lowering therapy in people younger than 45, this study may provide helpful insights on the benefits of lipid-lowering therapy as primary prevention from an earlier age. However, future research is needed to understand whether intervention in young people with a high lifetime risk, but low 10-year risk, would have more benefits than later intervention," he concludes. [1]

Professor Frank Kee, Queen's University Belfast, UK, adds: "Further research is also needed on how useful lifetime absolute risk estimates are for motivating behaviour change among otherwise healthy young people, and whether titrating any intervention dose according to a non-HDL target would be more effective than to a target of overall life-time risk." [1]

The authors note some limitations within their study, including that their study results may not be generalisable to other regions or racial and ethnic groups as the study was based on data from people of European ancestry from high-income countries.

The authors used data about the participants' non-HDL cholesterol levels only from when they entered the study, and so could not account for changes in cholesterol levels. However, they note that non-HDL cholesterol levels in young people generally remain stable over a 30-year period. They also could not account for participants who began lipid-lowering therapy during the study, but they did adjust cholesterol levels for people who were already taking lipid-lowering therapy at the start of the study.

Lastly, they note that their modelled 50% reduction posits that the effects of treatment apply over a longer period (30 years) than has been studied in clinical trials (around seven years), and note that real-world benefits of lipid-lowering therapies like statins are probably lower than the cholesterol reductions seen in trials because of sub-optimal adherence and side effects.

Writing in a linked Comment, Professor Jennifer G Robinson, University of Iowa, USA, also says that the size of this reduction is the main limitation of the analysis, as only long-term lipid-lowering treatment is likely to achieve a reduction of this size, but the existing evidence for these drugs does not assess decades-long treatment, meaning that the risk of adverse events - which would alter the benefit-to-risk ratio - is unclear.

She writes: "The novelty of Brunner and colleagues' findings arises from projecting the effect of beginning cholesterol-lowering therapy early in life. Such therapy could reduce the lifetime risk of atherosclerotic cardiovascular disease in patients with increased concentrations of non-HDL and LDL cholesterol, especially when risk is further amplified by the presence of comorbid factors. These individuals could be at much lower 10-year risk thresholds of atherosclerotic cardiovascular disease than are currently recommended for consideration of statin therapy. Lowering cholesterol with more intensive therapy is also supported by findings that generic statins are cost-saving or highly cost-effective even for primary prevention in patients at low risk of cardiovascular disease."

Credit: 
The Lancet

NASA analyzes Kammuri's heavy rainfall    

image: Typhoon Kammuri's surface rainfall accumulations estimated from NASA's IMERG from Nov. 24 at 7 p.m. EST to Dec. 3 at 10 p.m. EST. The heaviest rains fell over the central Philippine Sea, where the cyclone stalled; totals there were well over 500 mm (~20 inches, in red). Most of the central Philippines, including southern Luzon, received up to 150 mm or more (over 6 inches, light blue areas), with the highest amounts over the northern half of the island of Samar, where rainfall totals ranged from 250 to 350 mm (~10 to 14 inches, shown in yellow and light orange).

Image: 
NASA GSFC using IMERG data/with the Giovanni online data system, developed and maintained by the NASA GES DISC.

NASA provided analyses of Typhoon Kammuri's heavy rainfall on its track through the Northwestern Pacific Ocean using the Global Precipitation Measurement mission or GPM core satellite.

While the Atlantic hurricane season officially ended on November 30, Typhoon Kammuri (known as Tisoy in the Philippines), which recently struck the central Philippines as a powerful Category 4 typhoon, is a reminder that the Pacific typhoon season is not yet over.  In fact, while typhoon season does peak from around June through November, similar to the Atlantic, typhoons can occur throughout the year in the Pacific.

History of Kammuri

Kammuri first formed into a tropical depression from an area of low pressure on Nov. 25 north of Micronesia in the west-central Pacific, about 500 miles southeast of Guam. Kammuri intensified slowly and was still a tropical storm when its center passed about 130 miles south of Guam on the evening of Nov. 26. As the storm made its way through the eastern and central Philippine Sea over the following days, it was kept in check at times by moderate wind shear and hovered around typhoon intensity.

Analyzing Kammuri's Rainfall from Space

During this period, the GPM core satellite overflew the storm. The first image was taken on Nov. 29, 2019 at 1:27 p.m. EST (18:27 UTC; Nov. 30, 2019 at 3:27 a.m. local Palau Time, PWT) and shows surface rain rates within Kammuri from the GPM Dual-frequency Precipitation Radar (DPR) when the storm was about 800 miles east of the Philippines. At the time, Kammuri was a Category 1 typhoon with sustained winds estimated at 85 mph by the Joint Typhoon Warning Center (JTWC).

GPM, a satellite managed by both NASA and the Japan Aerospace Exploration Agency, showed areas of moderate to heavy rain organized into loose bands rotating around the northern and western side of the storm.  The eye, which is located along the right side of the image, was identifiable by the curvature in the inner rain bands, but the eyewall itself appeared rather weak.  These features are consistent with Kammuri having a well-developed though not yet powerful circulation. That would change over the next few days as Kammuri began to approach the Philippines.

Initially, Kammuri weakened slightly after the time of the GPM overpass, but on the evening of December 1 (local time) the storm began a rapid deepening cycle, intensifying from a Category 1 typhoon with sustained winds estimated at 80 mph by JTWC at 12:00 UTC (7 a.m. EST/9:00 p.m. PWT) on Dec. 1 to a Category 4 storm with sustained winds of 130 mph just 24 hours later. It was around this time that Kammuri made its first landfall in the Philippines, at about 11:00 p.m. local time near Gubat in Sorsogon Province, in the Bicol Region, at the southeastern tip of Luzon.

As it continued on westward through the central Philippines, Kammuri weakened, crossing the island of Mindoro as a Category 2 storm before exiting the Philippines into the eastern South China Sea.

IMERG Finds Heavy Rains in the Philippines

In addition to its powerful winds, Kammuri brought heavy rains to the Philippines.  IMERG, the Integrated Multi-satellitE Retrievals for GPM, is a unified satellite precipitation product produced by NASA to estimate surface precipitation over most of the globe.  IMERG is managed at NASA's Goddard Space Flight Center in Greenbelt, Md.

With IMERG, precipitation estimates from the GPM core satellite are used to calibrate precipitation estimates from microwave and infrared sensors on other satellites to produce half-hourly precipitation maps at 0.1° horizontal resolution.
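
As an illustration of that accumulation step only, the short Python sketch below sums synthetic half-hourly rain-rate grids into a storm-total map. The data are random placeholders; real IMERG granules are distributed as HDF5/netCDF files rather than generated in memory:

import numpy as np

# Synthetic stand-in for half-hourly IMERG-style rain-rate grids (mm/hr).
LAT_CELLS, LON_CELLS = 400, 400     # a 40 x 40 degree box at 0.1-degree resolution
HALF_HOURS = 9 * 24 * 2             # Nov. 25 - Dec. 3 spans nine days of half-hourly maps

rng = np.random.default_rng(0)
total_mm = np.zeros((LAT_CELLS, LON_CELLS))
for _ in range(HALF_HOURS):
    rate_mm_per_hr = rng.gamma(shape=0.1, scale=2.0, size=(LAT_CELLS, LON_CELLS))
    total_mm += rate_mm_per_hr * 0.5   # each half-hourly map covers 0.5 hr

print(f"peak accumulation in this synthetic box: {total_mm.max():.0f} mm")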

IMERG surface rainfall accumulations were estimated for the period from Nov. 25 through Dec. 3 for the Philippines and the surrounding region, from the time when Kammuri first became a tropical depression southeast of Guam until it had passed over Mindoro and into the South China Sea. By far the heaviest rains associated with Kammuri fell offshore, especially over the central Philippine Sea, where the cyclone stalled for a period, producing rainfall totals well over 500 mm (~20 inches).

Over land, most of the central Philippines, including southern Luzon, received on the order of 150 mm or more (over 6 inches), with the highest amounts over the northern half of the island of Samar, where rainfall totals were on the order of 250 to 350 mm (~10 to 14 inches).

So far, Kammuri is being blamed for up to 17 fatalities in the Philippines.  After leaving the Philippines, Kammuri weakened significantly and is expected to weaken even further and dissipate as the cyclone is sheared apart and driven southward by the northeast monsoon.

Kammuri's Status on Dec. 5

On Dec. 5 at 4 a.m. EST (0900 UTC), Tropical Storm Kammuri was in the South China Sea, where adverse atmospheric conditions were weakening the storm. It was centered near latitude 13.8 degrees north and longitude 113.7 degrees east, about 340 nautical miles east-southeast of Da Nang, Vietnam. Kammuri was moving to the southwest and had maximum sustained winds near 35 knots (40 mph/65 kph), making it a minimal tropical storm.

Kammuri continues to weaken and is expected to dissipate soon.

Credit: 
NASA/Goddard Space Flight Center