
New method for the measurement of nano-structured light fields

image: A monolayer of organic molecules is placed in the focused light field and responds to the illumination with fluorescence that encodes all information about the otherwise invisible properties.

Image: 
Pascal Runde

Structured laser light has already opened up a variety of applications: it enables precise material machining; the trapping, manipulation and controlled movement of small particles or cell compartments; and greater bandwidth for next-generation intelligent computing.

If these light structures are tightly focused by a lens, much as a magnifying glass is used as a burning glass, highly intense three-dimensional light landscapes are shaped, enabling significantly enhanced resolution in the applications named above. Light landscapes of this kind have paved the way to pioneering applications such as Nobel Prize-winning STED microscopy.

However, these nano-fields themselves could not yet be measured, since tight focusing creates field components that are invisible to typical measurement techniques. Up to now, this lack of appropriate metrological methods has impeded the breakthrough of nano-structured light landscapes as a tool for material machining, optical tweezers and high-resolution imaging.

A team led by physicist Prof. Dr. Cornelia Denz of the Institute of Applied Physics and chemist Prof. Dr. Bart Jan Ravoo of the Center for Soft Nanoscience at the University of Münster (Germany) has developed a nano-tomographic technique that detects the typically invisible properties of nano-structured fields in the focus of a lens - without requiring any complex analysis algorithms or data post-processing. For this purpose, the team combined its expertise in nano-optics and organic chemistry to realize an approach based on a monolayer of organic molecules. This monolayer is placed in the focused light field and responds to the illumination with fluorescence that encodes all information about the otherwise invisible properties.

Detecting this fluorescence response enables unambiguous identification of the nano-field from a single, fast and straightforward camera image. "This approach finally opens up the until now unexploited potential of these nano-structured light landscapes for many more applications," says Cornelia Denz, who is heading the study. The study has been published in the journal "Nature Communications".

Credit: 
University of Münster

NASA catches Tropical Storm Tapah by the tail

image: On Sept. 19, the MODIS instrument that flies aboard NASA's Terra satellite took this image of Tropical Storm Tapah. A large band of thunderstorms resembling a tail extended from the storm's western side and stretched through the East China Sea all the way north into the Sea of Japan.

Image: 
NASA Worldview

Tropical Storm Tapah has a huge "tail" on NASA satellite imagery. NASA's Terra satellite captured an image of the northwestern Pacific Ocean storm that revealed a large band of thunderstorms that resemble a large tail. The NASA imagery also indicated that the storm is getting better organized.

On Sept. 19, the Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Terra satellite provided a visible image of Tapah. The image showed the center of the storm was a good distance east of Taiwan and the northern Philippines. A large band of thunderstorms extended from the storm's western side and stretched through the East China Sea all the way north into the Sea of Japan. That large thunderstorm band made up Tapah's "tail."

The image also showed that there is a large band of powerful thunderstorms circling Tapah's low-level center of circulation. The shape of the storm is a clue to forecasters that a storm is either strengthening or weakening. If a storm takes on a more rounded shape it is getting more organized and strengthening. Conversely, if it becomes less rounded or elongated, it is a sign the storm is weakening. Tapah has appeared to become more symmetrical in the MODIS imagery, indicating it is getting better organized.

At 11 a.m. EDT (1500 UTC) on Sept. 20, the center of Tropical Storm Tapah was located near latitude 24.6 degrees north and longitude 127.1 degrees east. That puts the center about 147 nautical miles south of Kadena Air Base, Okinawa, Japan. Maximum sustained winds were near 50 knots (57 mph/92 kph).

The Joint Typhoon Warning Center or JTWC noted that Tapah was moving to the north-northeast. JTWC uses satellite imagery in their forecasts and has indicated that Tapah is strengthening. The JTWC forecast takes Tapah on a curved path to the northwest then northeast and through the Sea of Japan over Sept. 22 and 23.

Hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

For updated forecasts, visit: https://www.nhc.noaa.gov

Credit: 
NASA/Goddard Space Flight Center

Multicultural millennials respond positively to health 'edutainment,' Baylor research says

image: Tyrha Lindsey-Warren, Ph.D., clinical assistant professor of marketing in Baylor University's Hankamer School of Business.

Image: 
Robert Rogers, Baylor Marketing & Communications

WACO, Texas (Sept. 19, 2019) - Storytelling that educates and entertains - aka "edutainment" - is a powerful communications tool that can lead to positive health-related changes among multicultural millennials, according to a new marketing study from Baylor University.

Tyrha Lindsey-Warren, Ph.D., clinical assistant professor of marketing in Baylor University's Hankamer School of Business, led the study, "Making multicultural millennials healthy: The influence of health 'edutainment' and other drivers on health-oriented diet change," which is published in the Journal of Cultural Marketing Strategy. Charlene A. Dadzie, Ph.D., assistant professor of marketing at the University of South Alabama, coauthored the research.

The research sought to determine which health issues most concerned multicultural millennials and to gauge how effective media can be as a tool to address those issues and drive change.

"This study finds that by bolstering self-identity and employing health 'edutainment,' it is possible to have a positive impact on the health intentions and behaviors of the millennial generation," the researchers wrote.

There are more than 92 million millennials (ages 20-34) in the United States today, the researchers observed. More than 9 million of those are identified as being overweight, and much of that can be attributed to a sedentary lifestyle and media consumption. White Americans watch an average of 140 hours of television per month (35 hours per week); African Americans watch 213 hours of television per month; and Latino Americans watch 33 hours of television per week and stream more than six hours of video per month.

"Millennials are tech-savvy, they love social media, they're actually more health conscious than previous generations, and they have significant economic power," Lindsey-Warren said. "With so many millennials - in the scope of this study, multicultural millennials - watching so many screens, there is great opportunity to generate information and increase products and services geared towards health-oriented behavior."

But to take advantage of this opportunity, public and private organizations need to understand what drives millennials' health choices and communicate accordingly, the researchers wrote. A total of 265 people participated in two components of the study - a health survey of 245 undergraduate students and one-on-one, in-depth interviews of 20 multicultural millennials.

'I'm old-young and it's getting real.'

For the second part of the study, the researchers interviewed 20 people - 10 women and 10 men - from two organizations, a large northeastern U.S. university and a mid-sized nonprofit organization in Harlem, New York. The subjects represented cultural, socioeconomic and educational diversity.

The interviews were used to better understand the health status of these millennials as well as their relation to storytelling in the media, the researchers said. Participants answered health and wellness questions regarding their own health and personal network (example: "What is your ideal health?") and questions about their personal media usage (example: "What are the top five health and wellness issues you see regularly portrayed in the media?").

One of the strongest themes to emerge from those interviews was that multicultural millennials "long to be healthy in mind, body and spirit" and are open to "seeing authentic and relevant storytelling regarding health issues in the media that is meaningful."

"They would definitely respond to health messages when they truly see themselves in storytelling that meets them where they are in life," the researchers wrote.

Some of the health-related topics addressed by those being interviewed included healthy eating, asthma, sexual health, mental health and fitness.

One interviewee, a 21-year-old woman, said she gets "out of breath" when she runs up the stairs and her knees "crack and hurt."

"I would love to have ideal health again. I really would. I would love it. I'm old-young, and it's getting real," she said.

'More involved in the narrative'

In addition to the one-on-one interviews, each of the 20 interviewees watched media clips from two television programs - ABC's "Private Practice" and the nationally syndicated health show "The Doctors." Each show presented accurate health information concerning attention deficit hyperactivity disorder (ADHD).

"Private Practice" told its story via fictional characters in an episodic dramatic storytelling format. In this format, health issues were not directly promoted and there was no direct-to-camera discussion of those issues.

"The Doctors," on the other hand, utilized the format of real doctors conveying accurate health information in real-life situations, directly to the camera and in front of a studio audience.

"From the interviews, it was apparent that the storytelling in the health edutainment stimuli worked because the participants enjoyed and were more involved in the narrative conveyed in 'Private Practice' versus 'The Doctors,'" the researchers wrote. "For example, the 'Private Practice' segment told the story of a young boy and his parents who desperately asked their doctor to give them a prescription for ADHD medicine for their son, even though the son did not want the medicine and the doctor felt that the prescription might not have been needed."

The "Private Practice" story resonated better with those watching and scored high across genders, according to the study.

One 21-year-old male university student said he was diagnosed with ADHD as a child and saw himself and his parents in the "Private Practice" clip. He said he took ADHD medicine for a while.

"I didn't like it and I stopped taking it, and that was it. My parents were, 'OK - if you don't like it, that's the way it is - you're going to study harder, though. And, that was it," he told the interviewers.

A 28-year-old female from the nonprofit program said she saw the "Private Practice" clip and could relate to the situation as a parent.

"I felt I could relate because I felt that my son had ADHD, and I really, I kind of diagnosed him myself, and said that, so I was really interested in this topic," she told the researchers.

Marketing and advertising implications

Given the constant barrage of media in the lives of millennials, it is only reasonable to question the effect of this environment on their health and well-being, Lindsey-Warren said.

The findings of the study are useful for practitioners in marketing, advertising, public relations, digital and branded entertainment.

"Ultimately, the key to making a difference in the lives of multicultural millennials and their health, both now and into the future, may be achieving the right balance of educating and entertaining them," the researchers wrote.

"For millennials, Gen Z and alpha - the newest generation - 'edutainment' is and will continue to be a primary way to educate them," Lindsey-Warren said. "It's through the stories we tell on digital, on streaming, on gaming - that's the way those generations are learning."

Credit: 
Baylor University

Innovative data and analytics platform to accelerate drug development for rare diseases

TUCSON, Ariz. and WASHINGTON, September 20, 2019 -- The Critical Path Institute (C-Path) and the National Organization for Rare Disorders® (NORD) launched the Rare Disease Cures Accelerator-Data and Analytics Platform (RDCA-DAP) in Rockville, MD on Tuesday, Sept. 17. The platform, funded by a cooperative agreement through the Food and Drug Administration, [Critical Path Public-Private Partnerships Grant Number U18 FD005320], will provide data and analytics to aid in the understanding of rare diseases and to inform long-term drug development and support innovative trial designs.

"People with rare diseases need treatments; we need to do what we can to make development of those treatments as efficient, effective and fast as possible," said Center for Drug Evaluation and Research Director Janet Woodcock, MD. "The way to do that is to have all the data we've been talking about brought to bear on how we test the interventions -- the Rare Disease Cures Accelerator-Data and Analytics Platform is the vehicle that can deliver that data to the developers and the community."

The launch meeting, attended by more than 150 individuals from patient groups, industry and regulatory agencies, plus hundreds more via live stream, served to inform the rare disease community about the new platform, and to seek input on its development. FDA representatives Theresa Mullin, PhD, Associate Director for Strategic Initiatives and Billy Dunn, MD, Director, Division of Neurology Products, explained how RDCA-DAP fits into the FDA's vision for the future of drug development for rare diseases, and how it will provide tools to aid in understanding the trajectory of rare diseases and accelerate development of new treatments and cures. Dunn emphasized the importance of sharing and aggregating data, especially in the context of rare diseases, and how this helps to inform clinical trial design.

"We have tremendous experience with C-Path and NORD with regard to our approaches to data. It's truly altruistic and it's about bettering the community and allowing every member of the scientific and patient community to benefit from aggregated data," Dunn said. "There's increased recognition in the scientific community that being a good scientific citizen means sharing your data."

Panels of representatives from industry and patient groups discussed problems encountered in rare disease drug development and the need for this infrastructure to help get past those bottlenecks. Several successful programs that have accelerated efforts to develop treatments in specific disease areas were highlighted, which will inform the development of this new pan-rare disease platform. RDCA-DAP is designed to collaborate with existing efforts in this space.

"Getting all the key opinion leaders, patients and key stakeholders involved is absolutely essential," said Rosángel Cruz, MS, Director of Research and Clinical Affairs, Cure SMA. "Let's get together and learn each other's language. Sometimes in rare diseases we end up working in silos and as such, data ends up in silos, and there isn't a way for us to all come together and share data and learnings from that data. RDCA-DAP is important to the entire rare disease community."

NORD Director of Research Programs Vanessa Boulanger, MS, described the IAMRARE™ Registry Program that will serve as an initiation point for stakeholders looking to systematically collect natural history study and patient registry data. The data can be shared with the RDCA-DAP, as one way to ensure that the rare disease community informs the development, utilization and impact of the platform. C-Path Director of Clinical Pharmacology and Quantitative Medicine Klaus Romero, MD, described the infrastructure that is already in place to aggregate data at C-Path and how it may be utilized by the rare disease community to accelerate drug development, as well as new features being designed specifically for RDCA-DAP.

Credit: 
Critical Path Institute (C-Path)

Medications underused in treating opioid addiction, Mayo Clinic expert says

ROCHESTER, Minn. -- Though research shows that medication-assisted treatment can help people who are addicted to opioids, the three drugs approved by the Food and Drug Administration (FDA) are underused, according to a review of current medical data on opioid addiction in the U.S. This review appears in the October issue of Mayo Clinic Proceedings.

Along with addiction counseling, the drugs naltrexone, buprenorphine and methadone all have a place in treatment for opioid use disorder, says Tyler Oesterle, M.D., medical director of Mayo Clinic Health System's Fountain Centers drug and alcohol treatment programs. Evidence of the three drugs' effectiveness in treating opioid use disorder is well-established, says Dr. Oesterle, the review's lead author. This review uses data from available medical literature to provide a framework for determining the optimal approach for medication-assisted treatments.

"We have an opioid epidemic in this country that has been caused by many factors, including overzealous use of medication, the widespread availability of legal and illegal opioids, and societal expectations that all pain can be eliminated," Dr. Oesterle says. "We clearly cannot medicate our way out of the problem, but we have the opportunity to mediate the problem through more judicious use of prescription opioids."

Each drug has strengths and weaknesses, and the appropriate risks and benefits should be discussed with each patient suffering from an opioid use disorder, according to the study.

Naltrexone, which is approved to treat opioid and alcohol dependence and to block the effects of opioids in adults, is longer-acting and ideal as an opioid blocking agent, the review says. Patient compliance with buprenorphine is relatively high and associated with improved rates of sobriety and a reduction in accidental overdoses. The principal benefits of methadone are relief of narcotic craving, suppression of withdrawal syndrome, and blocking of the euphoric effects associated with heroin.

According to the review, the three drugs may be underused in part because access is limited by some legal requirements regarding who can write prescriptions. The one exception is naltrexone, which can be ordered by any prescriber.

Another challenge is that opioid use disorder can develop slowly, making it difficult for primary care providers to identify. "The development of an opioid use disorder can happen slowly, over time, and that makes it difficult to identify in primary care," Dr. Oesterle says. "We are currently researching better ways to identify details and advise patients."

Effectively responding to the opioid crisis requires moving beyond a medication-only approach, according to Dr. Oesterle. "We need to establish a generalizable framework that utilizes the full repertoire of responses and resources we have at our disposal." This includes medications, counseling, mental health services, workforce rehabilitation and social support, he says.

Credit: 
Mayo Clinic

Comparing major adverse cardiovascular events among patients with diabetes, reduced kidney function treated with metformin or sulfonylurea

What The Study Did: This observational study compared major cardiovascular events (including hospitalization for heart attack, stroke, transient ischemic attack or cardiovascular death) among patients with diabetes and reduced kidney function treated with metformin or a sulfonylurea (a class of drugs to treat diabetes).

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

Authors: Christianne L. Roumie, M.D., M.P.H., of the Nashville VA Medical Center in Nashville, is the corresponding author.

(doi:10.1001/jama.2019.13206)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Nano bulb lights novel path

image: An electron microscope image shows an array of thermal light emitters created by Rice University engineers. The emitters are able to deliver highly configurable thermal light.

Image: 
The Naik Lab/Rice University

HOUSTON - (Sept. 19, 2019) - What may be viewed as the world's smallest incandescent lightbulb is shining in a Rice University engineering laboratory with the promise of advances in sensing, photonics and perhaps computing platforms beyond the limitations of silicon.

Gururaj Naik of Rice's Brown School of Engineering and graduate student Chloe Doiron have assembled unconventional "selective thermal emitters" -- collections of near-nanoscale materials that absorb heat and emit light.

Their research, reported in Advanced Materials, one-ups a recent technique developed by the lab that uses carbon nanotubes to channel heat from mid-infrared radiation to improve the efficiency of solar energy systems.

The new strategy combines several known phenomena into a unique configuration that also turns heat into light -- but in this case, the system is highly configurable.

Basically, Naik said, the researchers made an incandescent light source by breaking down a one-element system -- the glowing filament in a bulb -- into two or more subunits. Mixing and matching the subunits could give the system a variety of capabilities.

"The previous paper was all about making solar cells more efficient," said Naik, an assistant professor of electrical and computer engineering. "This time, the breakthrough is more in the science than the application. Basically, our goal was to build a nanoscale thermal light source with specific properties, like emitting at a certain wavelength, or emitting extremely bright or new thermal light states.

"Previously, people thought of a light source as just one element and tried to get the best out of it," he said. "But we break the source into many tiny elements. We put sub-elements together in such a fashion that they interact with each other. One element may give brightness; the next element could be tuned to provide wavelength specificity. We share the burden among many small parts.

"The idea is to rely upon collective behavior, not just a single element," Naik said. "Breaking the filament into many pieces gives us more degrees of freedom to design the functionality."

The system relies on non-Hermitian physics, a quantum mechanical way to describe "open" systems that dissipate energy -- in this case, heat -- rather than retain it. In their experiments, Naik and Doiron combined two kinds of near-nanoscale passive oscillators that are electromagnetically coupled when heated to about 700 degrees Celsius. When the metallic oscillator emitted thermal light, it triggered the coupled silicon disk to store the light and release it in the desired manner, Naik said.

The light-emitting resonator's output, Doiron said, can be controlled by damping the lossy resonator or by controlling the level of coupling through a third element between the resonators. "Brightness and selectivity trade off," she said. "Semiconductors give you a high selectivity but low brightness, while metals give you very bright emission but low selectivity. Just by coupling these elements, we can get the best of both worlds."

"The potential scientific impact is that we can do this not just with two elements, but many more," Naik said. "The physics would not change."

He noted that though commercial incandescent bulbs have given way to LEDs for their energy efficiency, incandescent lamps are still the only practical means to produce infrared light. "Infrared detection and sensing both rely on these sources," Naik said. "What we've created is a new way to build light sources that are bright, directional and emit light in specific states and wavelengths, including infrared."

The opportunities for sensing lie at the system's "exceptional point," he said.

"There's an optical phase transition because of how we've coupled these two resonators," Naik said. "Where this happens is called the exceptional point, because it's exceptionally sensitive to any perturbation around it. That makes these devices suitable for sensors. There are sensors with microscale optics, but nothing has been shown in devices that employ nanophotonics."

The opportunities may also be great for next-level classical computing. "The International Technology Roadmap for Semiconductors (ITRS) understands that semiconductor technology is reaching saturation and they're thinking about what next-generation switches will replace silicon transistors," Naik said. "ITRS has predicted that will be an optical switch, and that it will use the concept of parity-time symmetry, as we do here, because the switch has to be unidirectional. It sends light in the direction we want, and none comes back, like a diode for light instead of electricity."

Credit: 
Rice University

Tumor resistance is promoted by anti-cancer protein

Lack of oxygen, or hypoxia, is a biological stressor that occurs under various conditions such as wound healing and stroke. To rescue the tissue, the body has innate mechanisms that "kick in" to make the cells of the hypoxic tissue more resistant and to assist in tissue repair. One such mechanism is the expression of a protein called hypoxia-inducible factor (HIF), which controls several processes such as glucose uptake, growth of blood vessels and cell proliferation. Despite its beneficial role in some diseases, HIF has also been found to be an important contributor to cancer progression.

For many years, scientists have been trying to understand why a well-known tumor suppressor protein called p53 is unable to impair the growth of cancer cells in hypoxic areas of solid tumors. Many studies have tried to elucidate the relationship between hypoxia, HIF and p53, without clear conclusions. Now, a team of scientists led by Dr. Rajan Gogna of the Champalimaud Centre for the Unknown in Lisbon, Portugal, has identified the source of the tumor's resistance to p53. Their results were published in the scientific journal Nucleic Acids Research.

To investigate this question, the multi-institutional team, which included groups in Portugal, the United States, the United Kingdom, India and Japan, carefully measured and simulated physiological hypoxia in tissue from humans and investigated the molecular changes that were induced in that tissue.

Using this approach, the team uncovered the answer to the longstanding question they were facing: they discovered that lack of oxygen alters the shape of p53, thereby inhibiting its ability to perform its role. "Our analysis showed that when p53 is subjected to hypoxic conditions, this protein changes its conformation and therefore is unable to bind to the DNA of cancer cells", Gogna explains.

This realisation clarified why p53 was not effective under hypoxia, but then, the team made a surprising discovery -- hypoxic cancer cells were in fact producing p53 in large quantities. This unexpected result led the team to investigate further the changes that were happening in the tissue.

Their analysis revealed that the shape p53 assumes under hypoxic conditions actually leads it to bind to HIF and stabilise it, thereby facilitating HIF's pro-survival action in cancer cells. "Not only is p53 unable to suppress the tumor, it actually generates genetic and molecular changes that promote its survival", says Gogna.

According to Gogna, these key findings may have important clinical consequences: "Since hypoxic and non-hypoxic areas will respond differently to chemo and radiotherapy, clinicians might want to measure how much of the tumor is hypoxic and make their therapeutic plan accordingly. In addition, observing the expression of p53 within tumors could potentially indicate how aggressive the tumor is."

Gogna adds that this discovery is an example of a basic research project that yields results with clinical implications. "Understanding this new molecular pathway is important for cancer as well as for other diseases that involve manifestation of chronic hypoxia, which include, among others, chronic inflammatory bowel disease, rheumatoid arthritis, epilepsy and cardiac hypertrophy."

Finally, Gogna concludes by saying that this research has special focus on pancreatic cancer, as "hypoxia-assisted resistance to chemotherapy is one of the most frustrating menaces associated with this disease. This study may help manufacture new anti-cancer drugs that will reduce the resistance caused by this molecular pathway."

Credit: 
Champalimaud Centre for the Unknown

Study estimates more than 100,000 cancer cases could stem from contaminants in tap water

WASHINGTON - A toxic cocktail of chemical pollutants in U.S. drinking water could result in more than 100,000 cancer cases, according to a peer-reviewed study from Environmental Working Group - the first study to conduct a cumulative assessment of cancer risks due to 22 carcinogenic contaminants found in drinking water nationwide.

In a paper published today in the journal Heliyon, EWG scientists used a novel analytical framework that calculated the combined health impacts of carcinogens in 48,363 community water systems in the U.S. This assessment does not include water quality information for the 13.5 million American households that rely on private wells for their drinking water.

"Drinking water contains complex mixtures of contaminants, yet government agencies currently assess the health hazards of tap water pollutants one by one," said Sydney Evans, lead author of the paper and a science analyst at EWG. "In the real world, people are exposed to combinations of chemicals, so it is important that we start to assess health impacts by looking at the combined effects of multiple pollutants."

This cumulative approach is common in assessing the health impacts of exposure to air pollutants but has never before been applied to a national dataset of drinking water contaminants. This model builds on a cumulative cancer risk assessment of water contaminants in the state of California and offers a deeper insight into national drinking water quality. As defined by U.S. government agencies, the calculated cancer risk applies to a statistical lifetime, or approximately 70 years.
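
As a rough sketch of what a cumulative assessment means in practice, the snippet below adds up hypothetical per-contaminant lifetime cancer risks for a single water system and converts the total into expected cases for the population served. The contaminant concentrations, risk slopes and population size are invented for illustration and are not values from the EWG study or its dataset.

```python
# Minimal sketch of an additive (cumulative) lifetime cancer risk estimate.
# Concentrations, risk slopes and population are hypothetical placeholders.
contaminants = {
    # name: (concentration in ug/L, lifetime excess cancer risk per ug/L)
    "arsenic":               (0.50, 1.0e-5),
    "total trihalomethanes": (30.0, 1.0e-7),
    "radium (combined)":     (0.30, 2.0e-6),
}

population_served = 50_000   # hypothetical community water system

# Screening-level assumption: individual risks are independent and additive.
cumulative_risk = sum(conc * slope for conc, slope in contaminants.values())
expected_cases = cumulative_risk * population_served

print(f"Cumulative lifetime cancer risk per person: {cumulative_risk:.2e}")
print(f"Expected lifetime cancer cases in this system: {expected_cases:.2f}")
```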

Most of the increased cancer risk is due to contamination with arsenic, disinfection byproducts and radioactive elements such as uranium and radium. Water systems with the highest risk tend to serve smaller communities and rely on groundwater. These communities often need improved infrastructure and resources to provide safe drinking water to their residents. However, large surface water systems contribute a significant share of the overall risk due to the greater population served and the consistent presence of disinfection byproducts.

"The vast majority of community water systems meet legal standards," said Olga Naidenko, Ph.D., EWG's vice president for science investigations. "Yet the latest research shows that contaminants present in the water at those concentrations - perfectly legal - can still harm human health."

"We need to prioritize source water protection, to make sure that these contaminants don't get into the drinking water supplies to begin with," Naidenko added.

Consumers who are concerned about chemicals in their tap water can install a water filter to help reduce their exposure to contaminants. Filters should be targeted to the specific contaminants detected in the tap water.

Credit: 
Environmental Working Group

Study shows both natural variation in ACE concentrations and lowering blood pressure with ACE inhibitors are associated with lower risk of type 2 diabetes

New research presented at this year's Annual Meeting of the European Association for the Study of Diabetes (EASD) in Barcelona, Spain (16-20 September) shows that use of angiotensin-converting enzyme (ACE) inhibitors to lower blood pressure is associated with a 24% reduced risk of developing type 2 diabetes (T2D) compared with placebo.

Furthermore, natural genetic variations related to ACE concentrations in the body are also related to T2D risk. The study is by Assistant Professor Marie Pigeyre of the Genetic and Molecular Epidemiology Laboratory, Hamilton Health Sciences and McMaster University, Hamilton, ON, Canada and colleagues.

Although previous research has suggested that ACE inhibitors may prevent T2D, the causal relationship between ACE inhibition and protection from T2D remains uncertain. In this new study, the authors used a 'Mendelian Randomisation' (MR) approach. Specifically, they assessed a person's risk of developing T2D based on natural genetic variations that influence the concentration of the ACE enzyme in the blood, and used this to infer the causal effect that ACE inhibitors would have on T2D risk.

First, the authors assessed the association between T2D prevalence and ACE serum concentration in the Outcome Reduction with Initial Glargine Intervention (ORIGIN) trial (N=8,197). Next, they investigated whether 17 genetic variants linked to lower ACE concentrations in the ORIGIN study (N=4,147) were also linked to lower prevalence of T2D in the DIAbetes Genetics Replication And Meta-analysis consortium (n=26,676 cases; 132,532 controls).

The researchers then constructed an ACE concentration-lowering genetic risk score (GRS) and tested it for association with T2D prevalence in the UK Biobank cohort (N=341,872). Finally, they compared the genetically determined effect of lower ACE concentrations on T2D risk to the pharmacological inhibition of ACE versus placebo, with a meta-analysis of randomised clinical trials (including 31,200 patients).
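
For illustration, the snippet below shows how a concentration-lowering genetic risk score of this kind is typically assembled: a weighted sum of ACE-lowering allele counts per person. The variants, effect sizes and genotypes are invented placeholders, not the 17 ORIGIN-derived variants used in the study.

```python
# Toy construction of a weighted genetic risk score (GRS).
# Effect sizes (betas) and genotypes below are invented for illustration.
import numpy as np

# Per-allele effect of each ACE-lowering variant on serum ACE (arbitrary units).
effect_sizes = np.array([0.12, 0.08, 0.05, 0.20])

# Allele counts (0, 1 or 2 copies of the ACE-lowering allele) for three
# hypothetical individuals, one row per person.
genotypes = np.array([[0, 1, 2, 1],
                      [2, 2, 1, 0],
                      [1, 0, 0, 1]])

# GRS = weighted sum of allele counts; a higher score predicts lower ACE
# concentration and, per the study's hypothesis, lower type 2 diabetes risk.
grs = genotypes @ effect_sizes
print(grs)   # hypothetical scores, roughly [0.38 0.45 0.32]
```

In a Mendelian randomisation analysis, a score like this is then tested for association with T2D status in a large cohort such as UK Biobank; because genotypes are fixed at conception, an association is taken as evidence that lower ACE concentration itself, rather than some confounder, changes diabetes risk.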

The MR analysis showed that a lower genetically determined ACE serum concentration predicted a lower risk of type 2 diabetes, and a meta-analysis of six RCTs estimated that ACE inhibitors reduced type 2 diabetes risk by 24% compared with placebo.

The authors say: "These results support the protective effect of long-term ACE inhibition on type 2 diabetes risk. Although future research is needed to more accurately clarify the metabolic actions of ACE inhibitors, current evidence supports that targeting ACE may protect a person from developing type 2 diabetes. Furthermore, considering a patient's risk of developing type 2 diabetes may be recommended when prescribing blood-pressure lowering drugs -- if at high risk of type 2 diabetes, an ACE inhibitor could be considered."

They add: "Current guidelines do not take account of the protective effect of ACE inhibitors on T2D risk and this is something that could be considered."

Credit: 
Diabetologia

'Nanochains' could increase battery capacity, cut charging time

image: Artistic depiction of a coin cell battery with a copper electrode (left) containing a black nanochain structure, which researchers have discovered could increase the capacity of a battery and cut charging time.

Image: 
Purdue University illustration/Henry Hamann

WEST LAFAYETTE, Ind. -- How long the battery of your phone or computer lasts depends on how many lithium ions can be stored in the battery's negative electrode material. If the battery runs out of these ions, it can't generate an electrical current to run a device and ultimately fails.

Materials with a higher lithium ion storage capacity are either too heavy or the wrong shape to replace graphite, the electrode material currently used in today's batteries.

Purdue University scientists and engineers have introduced a potential way that these materials could be restructured into a new electrode design that would allow them to increase a battery's lifespan, make it more stable and shorten its charging time.

The study, appearing on the cover of the September issue of Applied Nano Materials, created a net-like structure, called a "nanochain," of antimony, a metalloid known to enhance lithium ion charge capacity in batteries.

The researchers compared the nanochain electrodes to graphite electrodes, finding that when coin cell batteries with the nanochain electrode were only charged for 30 minutes, they achieved double the lithium-ion capacity for 100 charge-discharge cycles.

Some types of commercial batteries already use carbon-metal composites similar to antimony metal negative electrodes, but the material tends to expand to up to three times its original volume as it takes in lithium ions, causing it to become a safety hazard as the battery charges.

"You want to accommodate that type of expansion in your smartphone batteries. That way you're not carrying around something unsafe," said Vilas Pol, a Purdue associate professor of chemical engineering.

Through applying chemical compounds - a reducing agent and a nucleating agent - Purdue scientists connected the tiny antimony particles into a nanochain shape that would accommodate the required expansion. The particular reducing agent the team used, ammonia-borane, is responsible for creating the empty spaces - the pores inside the nanochain - that accommodate expansion and suppress electrode failure.

The team applied ammonia-borane to several different compounds of antimony, finding that only antimony-chloride produced the nanochain structure.

"Our procedure to make the nanoparticles consistently provides the chain structures," said P. V. Ramachandran, a professor of organic chemistry at Purdue.

The nanochain also keeps lithium ion capacity stable for at least 100 charging-discharging cycles. "There's essentially no change from cycle 1 to cycle 100, so we have no reason to think that cycle 102 won't be the same," Pol said.

Henry Hamann, a chemistry graduate student at Purdue, synthesized the antimony nanochain structure and Jassiel Rodriguez, a Purdue chemical engineering postdoctoral candidate, tested the electrochemical battery performance.

The electrode design has the potential to be scalable for larger batteries, the researchers say. The team plans to test the design in pouch cell batteries next.

Credit: 
Purdue University

NASA analyzes rainfall rates in new Tropical Storm Tapah

image: The GPM core satellite passed over strengthening Tropical Storm Tapah in the northwestern Pacific Ocean on Sept. 16, 2019 at 12:11 p.m. EDT (1611 UTC) and found the heaviest rainfall (pink) falling at as much as 1.6 inches (40 mm) per hour.

Image: 
NASA/JAXA/NRL

Tropical Storm Tapah formed quickly in the northwestern Pacific Ocean and as it was strengthening from a depression to a tropical storm, the Global Precipitation Measurement mission or GPM core satellite passed overhead from its orbit in space and measured rainfall rates throughout the storm.

NASA has the unique capability of peering under the clouds in storms and measuring the rate at which rain is falling. The GPM core satellite passed over Tropical Storm Tapah in the northwestern Pacific Ocean on Sept. 16 at 12:11 p.m. EDT (1611 UTC).

GPM found the heaviest rainfall around the storm's center, where it was falling at a rate of as much as 1.6 inches (40 mm) per hour. Forecasters at the Joint Typhoon Warning Center incorporate the rainfall data into their forecasts.

Both the Japan Aerospace Exploration Agency (JAXA) and NASA manage GPM.

At 11 a.m. EDT (1500 UTC), the center of Tropical Storm Tapah was located near latitude 23.1 degrees north and longitude 127.9 degrees east. That puts Tapah's center about 211 nautical miles south of Kadena Air Base, Okinawa, Japan. Maximum sustained winds remain near 40 knots (46 mph/74 kph) with higher gusts. Tapah is forecast to strengthen but remain a tropical storm over the next several days.

Credit: 
NASA/Goddard Space Flight Center

Where to park your car, according to math

video: A primer on simple parking strategies, based on the research paper by Paul Krapivsky and Sidney Redner: http://dx.doi.org/10.1088/1742-5468/ab3a2a

Image: 
Michael Garfield for the Santa Fe Institute

Just as mathematics reveals the motions of the stars and the rhythms of nature, it can also shed light on the more mundane decisions of everyday life. Where to park your car, for example, is the subject of a new look at a classic optimization problem by physicists Paul Krapivsky (Boston University) and Sidney Redner (Santa Fe Institute) published in this week's Journal of Statistical Mechanics.

The problem assumes what many of us can relate to when exhausted, encumbered, or desperate to be somewhere else: the best parking space is the one that minimizes time spent in the lot. So that space by the front door is ideal, unless you have to circle back three times to get it. In order to reduce the time spent driving around the lot AND walking across it, the efficient driver must decide whether to go for the close space, quickly park further out, or settle for something in-between.

"Mathematics allows you to make intelligent decisions," Redner says. "It allows you to approach a complex world with some insights."

In their paper, Krapivsky and Redner map three simple parking strategies onto an idealized, single row parking lot. Drivers who grab the first space available follow what the authors call a "meek" strategy. They "waste no time looking for a parking spot," leaving spots near the entrance unfilled. Those who gamble on finding a space right next to the entrance are "optimistic." They drive all the way to the entrance, then backtrack to the closest vacancy. "Prudent" drivers take the middle path. They drive past the first available space, betting on the availability of at least one other space further in. When they find the closest space between cars, they take it. If no spaces exist between the furthest parked car and the entrance, prudent drivers backtrack to the space a meek driver would have claimed straightaway.

Despite the simplicity of the three strategies, the authors had to use multiple techniques to compute their relative merits. Oddly enough, the meek strategy mirrored a dynamic seen in the microtubules that provide scaffolding within living cells. A car that parks immediately after the furthest car corresponds to a monomer glomming on to one end of the microtubule. The equation that describes a microtubule's length-- and sometimes dramatic shortening-- also described the chain of "meek" cars that accumulate at the far end of the lot.

"Sometimes there are connections between things that seem to have no connection," Redner says. "In this case, the connection to microtubule dynamics made the problem solvable."

To model the optimistic strategy, the authors wrote a differential equation. Once they began to mathematically express the scenario, they spotted a logical shortcut which greatly simplified the number of spaces to consider.

The prudent strategy, according to Redner, was "inherently complicated" given the many spaces in play. The authors approached it by creating a simulation that allowed them to compute the average density of filled spots and the amount of backtracking required.
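
To make the three strategies concrete, here is a toy, static snapshot of a single-row lot. It is emphatically not the authors' model, which is a dynamical process with cars continually arriving and departing; the occupancy pattern, driving and walking speeds, and cost bookkeeping below are invented purely to show how each strategy picks a spot on one fixed configuration.

```python
# Toy snapshot of the three parking strategies from the press description.
# Spot 0 is at the building entrance; the driver enters at the far end of the
# row and drives toward the entrance. Everything here is illustrative only.
import random

N = 60                       # length of the single-row lot
random.seed(7)
# A cluster of cars near the entrance with a few gaps, empty spots beyond it.
occupied = [random.random() < 0.9 if i < 40 else False for i in range(N)]

DRIVE, WALK = 1.0, 5.0       # time per spot; walking assumed 5x slower (made up)

def cost(deepest, park_at):
    """Drive in to `deepest`, backtrack to `park_at`, then walk to the entrance."""
    driven = (N - 1 - deepest) + abs(park_at - deepest)
    return DRIVE * driven + WALK * park_at

vacant = [i for i in range(N) if not occupied[i]]
farthest_car = max(i for i in range(N) if occupied[i])

# Meek: park right behind the farthest-out car, without searching for gaps.
meek_spot = farthest_car + 1
# Optimistic: drive to the entrance, then backtrack to the closest vacancy.
optimistic_spot = min(vacant)
# Prudent: take the vacancy closest to the entrance that lies within the
# filled region; if there is none, backtrack to the meek spot.
gaps = [i for i in vacant if i < farthest_car]
prudent_spot, prudent_deepest = (min(gaps), min(gaps)) if gaps else (meek_spot, 0)

print("meek      :", cost(meek_spot, meek_spot))
print("optimistic:", cost(0, optimistic_spot))
print("prudent   :", cost(prudent_deepest, prudent_spot))
```

Because the lot is frozen and the speeds are arbitrary, the printed costs only illustrate the trade-off between backtracking and walking distance; the paper's ranking of the strategies comes from the full dynamical analysis.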

So which strategy is best? As the name suggests, the prudent strategy. Overall, it costs drivers the least amount of time, followed closely by the optimistic strategy. The meek strategy was "risibly inefficient," to quote the paper, as the many spaces it left empty created a lengthy walk to the entrance.

Redner acknowledges that the optimization problem sacrifices much real-world applicability in exchange for mathematical insight. Leaving out competition between cars, for example, or assuming cars follow a uniform strategy under each scenario, are unrealistic assumptions that the authors may address in a future model.

"If you really want to be an engineer you have to take into account how fast people are driving, the actual designs of the parking lot and spaces -- all these things," he remarks. "Once you start being completely realistic, [every parking situation is different] and you lose the possibility of explaining anything."

Still, for Redner, it's all about the joy of thinking analytically about everyday situations.

"We're living in a crowded society and we always encounter crowding phenomena in parking lots, traffic patterns, you name it," he says. "If you can look at it with the right eyes, you can account for something."

Credit: 
Santa Fe Institute

The next agricultural revolution is here

image: Cold Spring Harbor Laboratory's Uplands Farm has a history of ground-breaking plant research and environmental activism.

Image: 
CSHL/2019

As a growing population and climate change threaten food security, researchers around the world are working to overcome the challenges that threaten the dietary needs of humans and livestock. A pair of scientists is now making the case that the knowledge and tools exist to facilitate the next agricultural revolution we so desperately need.

Cold Spring Harbor Laboratory (CSHL) Professor Zach Lippman, a Howard Hughes Medical Institute investigator, recently teamed up with Yuval Eshed, an expert in plant development at the Weizmann Institute of Science in Israel, to sum up the current and future states of plant science and agriculture.

Their review, published in Science, cites examples from the last 50 years of biological research and highlights the major genetic mutations and modifications that have fueled past agricultural revolutions. Those include tuning a plant's flowering signals to adjust yield, creating plants that can tolerate more fertilizer or different climates, and introducing hybrid seeds to enhance growth and resist disease.

Beneficial changes like these were first discovered by chance, but modern genomics has revealed that most of them are rooted in two core hormonal systems: Florigen, which controls flowering, and Gibberellin, which influences stem height.

Lippman and Eshed suggest that in an age of fast and accurate gene editing, the next revolutions do not need to wait for chance discoveries. Instead, by introducing a wide variety of crops to changes in these core systems, the stage can be set to overcome any number of modern-day challenges.

Dwarfing and flower power revolutions

To explain their point, the scientists reviewed research that focused on key moments in agricultural history, such as the Green Revolution.

Before the 1960s, fertilizing for a large wheat yield would result in the plants growing too tall. Weighed down with their grainy bounty, the wheat stems would fold and rot away, resulting in yield losses. It was only after Nobel laureate Norman Borlaug began working with mutations that affect the Gibberellin system that wheat became the shorter, more reliable crop we know today. Borlaug's dwarfing was also applied to rice, helping many fields weather storms that would have been catastrophic only years before. This reapplication of the same technique to a different plant hinted that a core system was in play.

More recent examples Lippman and Eshed mention include the changes undergone by cotton crops in China. There, growers turned the normally sprawling, southern plantation plant into a more compact, faster flowering bush better suited for China's northern climate. To do so, they took advantage of a mutation that affects Florigen, which promotes flowering, and its opposite, Antiflorigen.

This kind of change is closely related to Lippman's own work. He often works with tomatoes, and explained that an Antiflorigen mutation in tomato was also the catalyst that transformed the Mediterranean vine crop into the stout bushes grown in large-scale agricultural systems throughout the world today. What's striking, Lippman said, is that cotton is quite unlike tomato.

"They're evolutionary very different in terms of the phylogeny of plants. And despite that, what makes a plant go from making leaves to making flowers is the same," he said. "That core program is deeply conserved."

Fine-tuning a revolution

As the review details, this has defined what makes an agricultural revolution. A core system--either Gibberellin, Florigen, or both--is affected by a mutation, resulting in some helpful trait. In a moment of pure serendipity, the plants boasting this trait are then discovered by the right person.

It then takes many more years of painstaking breeding to tweak the intensity of that mutation until it affects the system just right for sustainable agriculture. It's like tuning an instrument to produce the perfect sound.

Lippman and Eshed note that CRISPR gene editing is speeding up that tuning process. However, they show that the best application of gene editing may not be to just tune preexisting revolutionary mutations, but instead, to identify or introduce new ones.

"If past tuning has been creating genetic variation around those two core systems, maybe we can make more variety within those systems," he said. "It would certainly mitigate the amount of effort required for doing that tuning, and has the potential for some surprises that could further boost crop productivity, or adapt crops faster to new conditions."

A future in... chickpeas?

More of that genetic variety could also set the stage for new agricultural revolutions. By introducing genetic variation to those two core systems that define most revolutions, farmers might get to skip the serendipitous waiting game. Chickpea is one example.

"There's a lot more room for us to be able to create more genetic diversity that might increase productivity and improve adaptation survival in marginal grounds, like in drought conditions," Lippman said.

Drought resistance is just one benefit of under-utilized crops. Past revolutions have allowed crops to be more fruitful or to grow in entirely new hemispheres. Having a means to continue these revolutions with more crops and at a greater frequency would be a boon in a crowded, hungry, and urbanizing world.

"Given that rare mutations of Florigen/Antiflorigen and Gibberellin/DELLA mutations spawned multiple revolutions in the past, it is highly likely that creating novel diversity in these two hormone systems will further unleash agricultural benefits," the scientists wrote.

Credit: 
Cold Spring Harbor Laboratory

Nearly three billion fewer birds in North America since 1970

North America has lost nearly three billion birds since 1970, according to a new report, which also details widespread population declines among hundreds of North American bird species, including those once considered abundant. The results signal a long-developing yet largely overlooked biodiversity crisis occurring in avifaunal habitats across North America.

Human impacts have contributed to an increase in global extinctions. Research focused on understanding extinction is underway, but much of it fails to recognize ongoing declines in "abundance" within still-common species, even as such declines can have significant ecological, evolutionary and economic impacts. "Given the current pace of global environmental change, quantifying change in species abundances is essential to assess ecosystem impacts," Kenneth V. Rosenberg and colleagues say. Evaluating these declines requires large and long-term datasets, which do not exist for most animals. However, long-term, detailed records do exist for bird populations.

Using multiple standardized bird-monitoring datasets, Rosenberg et al. analyzed the net change over recent decades in numbers of birds for 529 species in the continental United States and Canada. Their results show a loss of nearly one in four birds since 1970 - a net loss of 2.9 billion birds. According to the authors, more than 90% of this loss can be attributed to 12 bird families, including songbird species like sparrows and warblers. However, not all species are on the decline; some bird species, including raptors and waterfowl, showed population gains - likely due to focused conservation efforts and Endangered Species legislation. Similar strategies for other species could avert the potential collapse of North American avifauna, the authors say.

To expand their analysis, the authors used migration data from the NEXRAD radar network to estimate long-term changes in nocturnal migratory passage. The results, similar to those from the ground-based bird-monitoring datasets, reveal a steep decline for migrating birds over a recent 10-year period, particularly in the eastern U.S.
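
As a back-of-the-envelope illustration of how a headline figure like "one in four birds lost" is assembled, the snippet below sums invented species-level changes and expresses the net loss as a share of the 1970 total. The species and numbers are made up and are not data from the Rosenberg et al. analysis.

```python
# Tiny sketch of the aggregation behind a "net change in total birds" figure:
# sum species-level changes and express the loss as a share of the 1970 total.
populations = {
    # species: (abundance in 1970, abundance now), in millions of individuals
    "sparrow-like species A": (120.0, 70.0),
    "warbler-like species B": ( 45.0, 28.0),
    "waterfowl species C":    ( 10.0, 14.0),   # an increasing species
}

total_1970 = sum(then for then, now in populations.values())
total_now  = sum(now  for then, now in populations.values())
net_change = total_now - total_1970

print(f"Net change: {net_change:+.1f} million birds "
      f"({net_change / total_1970:+.1%} of the 1970 total)")
```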

Credit: 
American Association for the Advancement of Science (AAAS)