Tech

NASA-NOAA satellite studies Tropical Storm Kiko's center

image: NASA-NOAA's Suomi NPP satellite passed over Tropical Storm Kiko on Sept. 17 and revealed a circular area of powerful storms around the low-level center. The image showed strong bands of thunderstorms were located over the northern and western quadrants of the storm.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

Hurricane Kiko weakened to a tropical storm, but imagery from NASA-NOAA's Suomi NPP satellite showed that the storm had maintained a circular area of powerful storms around its low-level center.

The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP provided a visible image of Kiko on Sept. 17 that revealed powerful storms circling the low-level center. The image showed strong bands of thunderstorms located over the northern and western quadrants of the storm.

Hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

NOAA's National Hurricane Center or NHC said, "At 11 a.m. EDT (1500 UTC) on Sept. 18, the center of Tropical Storm Kiko was located near latitude 16.0 degrees north and longitude 126.7 degrees west." That is 1,190 miles (1,920 km) west-southwest of the southernmost tip of Baja California, Mexico.

Kiko is moving toward the west-southwest near 6 mph (9 kph). A westward track is likely later today, followed by a west-northwest motion on Thursday and Friday, and a westward motion on Saturday. Maximum sustained winds are near 60 mph (95 kph) with higher gusts. The estimated minimum central pressure is 1001 millibars. NHC said, "Kiko appears to be stronger this morning with very deep convection near the center, although the cloud pattern is somewhat distorted."

NHC indicated that Kiko could become a hurricane again on Friday or Saturday [Sept. 20 or 21].

Credit: 
NASA/Goddard Space Flight Center

Sound of the future: A new analog to quantum computing

image: Pierre Deymier (right) and UA President Robert C. Robbins examine the acoustic system that allowed researchers to create Bell states using phonons.

Image: 
Paul Tumarkin/Tech Launch Arizona

Human beings create a lot of data in the digital age - whether it's through everyday items like social media posts, emails and Google searches, or more complex information about health, finances and scientific findings.

The International Data Corp. reported that the global datasphere contained 33 zettabytes, or 33 trillion gigabytes, in 2018. By 2025, it expects that number to grow to 175 zettabytes. Stored on DVDs, 175 zettabytes would fill a stack of discs long enough to circle Earth 222 times.
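The unit conversion is easy to sanity-check (simple arithmetic, not figures from the IDC report; the Python below is purely illustrative):

    # 1 zettabyte = 10^21 bytes and 1 gigabyte = 10^9 bytes,
    # so 1 ZB = 10^12 GB, i.e., one trillion gigabytes.
    ZB_IN_GB = 1e21 / 1e9
    print(f"33 ZB  = {33 * ZB_IN_GB:.1e} GB")   # 3.3e+13 GB, or 33 trillion
    print(f"175 ZB = {175 * ZB_IN_GB:.2e} GB")  # 1.75e+14 GB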

While quantum computing has been touted as a way to intelligently sort through big data, quantum environments are difficult to create and maintain. Entangled states of quantum bits, or qubits, usually last less than a second before collapsing. Qubits are also highly sensitive to their surrounding environments and must be stored at cryogenic temperatures.

In a paper published in Nature Research's journal Communications Physics, researchers in the University of Arizona Department of Materials Science and Engineering have demonstrated that acoustic waves in a classical environment can do the work of quantum information processing without the time limitations and fragility of quantum systems.

"We could run our system for years," said Keith Runge, director of research in the Department of Materials Science and Engineering and one of the paper's authors. "It's so robust that we could take it outside to a tradeshow without it being perturbed at all - earlier this year, we did."

Materials science and engineering research assistant professor Arif Hasan led the research. Other co-authors include MSE research assistant professor Lazaro Calderin; undergraduate student Trevor Lata; Pierre Lucas, professor of MSE and optical sciences; and Pierre Deymier, MSE department head, member of the applied mathematics Graduate Interdisciplinary Program, and member of the BIO5 Institute. The team is working with Tech Launch Arizona, the office of the UA that commercializes inventions stemming from research, to patent their device and is investigating commercial pathways to bring the innovation to the public.

Quantum Superposition

In classical computing, information is stored as either 0s or 1s, the same way a coin must land on either heads or tails. In quantum computing, qubits can be stored in both states at the same time - a so-called superposition of states. Think of a coin balanced on its side, spinning so quickly that both heads and tails seem to appear at once.

When qubits are entangled, anything that happens to one qubit affects the other through a principle called nonseparability. In other words, knock down one spinning coin on a table and another spinning coin on the same table falls down, too. A principle called nonlocality keeps the particles linked even if they're far apart - knock down one spinning coin, and its entangled counterpart on the other side of the universe falls down, too. The entangled qubits create a Bell state, in which all parts of a collective are affected by one another.
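In standard textbook notation (a generic two-qubit example, not specific to the acoustic system described below), the simplest Bell state is

\[ |\Phi^{+}\rangle = \frac{1}{\sqrt{2}}\big(|0\rangle_{A}|0\rangle_{B} + |1\rangle_{A}|1\rangle_{B}\big), \]

which is nonseparable: no single-qubit states \(|\psi\rangle_{A}\) and \(|\chi\rangle_{B}\) satisfy \(|\Phi^{+}\rangle = |\psi\rangle_{A} \otimes |\chi\rangle_{B}\), so measuring one qubit immediately fixes the state of the other.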

"This is key, because if you manipulate just one qubit, you are manipulating the entire collection of qubits," Deymier said. "In a regular computer, you have many bits of info stored as 0s or 1s, and you have to address each one of them."

From Light to Sound

But, like a coin spinning on its edge, quantum states are fragile. The act of measuring a quantum state can cause the link to collapse, or decohere - just as taking a picture of a spinning coin captures only one side of the coin. That's why qubit states can only be maintained for short periods.

But there's a way around the use of quantum mechanics for data processing: Optical scientists and electrical and computer engineering researchers have demonstrated the ability to create systems of photons, or units of light, that exhibit nonseparability without nonlocality. Though nonlocality is important for specific applications like cryptography, it's the nonseparability that matters for applications like quantum computing. And particles that are nonseparable in classical Bell states, rather than entangled in a quantum Bell state, are much more stable.

The materials science and engineering team has taken this a step further by demonstrating for the first time that classical nonseparability can be applied to acoustic waves, not just light waves. They use phi-bits, units made up of quasi-particles called phonons, which transmit sound and heat waves.
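Schematically (a sketch of the concept rather than the paper's exact formalism), a classically nonseparable acoustic state links two degrees of freedom of a single sound field instead of two separate particles:

\[ \psi \propto |{+}k\rangle\,|S\rangle + |{-}k\rangle\,|A\rangle, \]

where \(|{\pm}k\rangle\) labels the propagation direction along the rods and \(|S\rangle\), \(|A\rangle\) label two patterns of relative rod motion. As in the quantum case, \(\psi\) cannot be factored into a single product of a direction state and a motion state.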

"Light lasers and single photons are part of the field photonics, but soundwaves fall under the umbrella of phononics, or the study of phonons," Deymier said. "In addition to being stable, classically entangled acoustic waves are easy to interact with and manipulate."

Complex Science, Simple Tools

The materials needed to demonstrate such a complex concept were simple: three aluminum rods, enough epoxy to connect them and some rubber bands for elasticity.

Researchers sent a wave of sound vibrations down the rods, then monitored two degrees of freedom of the waves: what direction the waves moved down the rods (forward or backward) and how the rods moved in relation to one another (whether they were waving in the same direction and at similar amplitudes). To excite the system into a nonseparable state, they identified a frequency at which these two degrees of freedom were linked and sent the waves at that frequency. The result? A Bell state.

"So, we have an acoustic system that gives us the possibility creating these Bell states," Deymier said. "It's the complete analog to quantum mechanics."

Demonstrating that this is possible has opened the door to applying classical nonseparability to the emerging field of phononics. Next, the researchers will work to increase the number of degrees of freedom that can be classically entangled - the more, the better. They also want to develop algorithms that can use these nonseparable states to manipulate information.

Once the system is refined, they plan to resize it from the tabletop down to the microscale, ready to deploy on computer chips in data centers around the world.

Credit: 
University of Arizona College of Engineering

Discovery of tanycytic TSPO inhibition as a potential therapeutic target for obesity treatment

image: Professor Eun-Kyoung Kim in the DGIST Department of Brain and Cognitive Sciences (left) and Seolsong Kim, an integrated M.S.-Ph.D. program student (right)

Image: 
DGIST

DGIST announced on Sept. 5 that Professor Eun-Kyoung Kim (Director of the Neurometabolomic Research Center) has discovered new targets to prevent and treat high-fat diet-induced obesity. This achievement is expected to suggest a new direction for the development of obesity treatments.

Due to westernized eating habits in today's society, the prevalence of metabolic diseases such as obesity and diabetes has markedly increased. To prevent and treat these diseases, it is important to decrease appetite and increase energy expenditure. However, the specific mechanisms needed for effective treatment of metabolic disease have not yet been elucidated.

Based on the knowledge that tanycytes, which connect the cerebral ventricles and the hypothalamus, detect nutrients in food and control appetite, Dr. Eun-Kyoung Kim's research team found that TSPO (translocator protein), a mitochondrial protein in tanycytes, responds to overnutrition signals and controls lipid and energy metabolism. The research team also reported the interesting finding that TSPO inhibition increases energy expenditure in the body and decreases appetite.

The research team observed that excessive lipid droplets, the major cellular organelles for the storage of neutral lipids, accumulate in the tanycytes of high-fat diet-induced obese mice. In those mice, inhibition of tanycytic TSPO induced lipophagy, a type of autophagy that breaks down lipid droplets for use as an energy source, thereby modulating energy homeostasis. As a result, production of adenosine triphosphate (ATP), which plays an essential role in cellular energy metabolism, increased. Furthermore, when tanycytic TSPO was inhibited in high-fat diet-induced obese mice, food intake was reduced and energy expenditure was elevated, leading to weight loss.

Dr. Kim said, "Clarifying the role of tanycytic TSPO in controlling lipophagy helps us understand the functions of these cells in an overnutrition state. It implies the possibility of using tanycytic TSPO as a potential therapeutic strategy for metabolic diseases such as obesity."

Credit: 
DGIST (Daegu Gyeongbuk Institute of Science and Technology)

Compound may play role in halting pancreatic cancer

In early test tube and mouse studies, investigators at Johns Hopkins Medicine and the Johns Hopkins Kimmel Cancer Center have found that nonmuscle myosin IIC (MYH14), a protein activated in response to mechanical stress, helps promote metastatic behavior in pancreatic cancer cells, and that the compound 4-hydroxyacetophenone (4-HAP), known to stiffen myosin IIC-containing cells, can send it into overdrive, overwhelming the ability of cells to invade nearby tissue.

The work, described online in July in the journal Cancer Research, found that 4-HAP reduced metastatic tumor formation in a mouse model of human pancreatic cancer, as assessed by the fraction of liver surface covered by tumor tissue. The researchers say their results suggest that targeting MYH14 and similar cytoskeletal proteins with 4-HAP is a potentially novel strategy for improving the survival of pancreatic cancer patients, and could eventually become part of a combination strategy with chemotherapy and/or immunotherapy agents.

Much of the focus in developing new cancer drugs involves trying to inhibit a process or protein of interest, says senior study author Douglas N. Robinson, Ph.D., a professor of cell biology and oncology at Johns Hopkins. But some nonmuscle myosin II proteins have tumor-suppressive activities, so inhibiting them could increase rather than decrease the likelihood of metastasis.

"There are two basic ways to stop the runaway train that is cancer metastasis. One is to throw up a roadblock in front of it, like an inhibitor. Or, we can get behind the train and push it faster and shove it off the tracks, and that's kind of what we're doing with 4-HAP," he says. "We're taking the system and shoving it to the right instead of the left, and thereby helping to halt the progression of disease-like behavior, such as invasion and metastasis."

The researchers first identified the proteins of interest by studying how cells change their shape in response to altering mechanical inputs, much like cancer cells will face as they navigate different mechanical environments as they metastasize, explains lead study author Alexandra Surcel, Ph.D., a research associate faculty member in Robinson's lab. The investigators made some predictions about which proteins would be involved in the mechanobiome, a network of proteins that defines cells' mechanical properties, senses and generates forces, and integrates mechanical with chemical cues. They studied seven proteins -- nonmuscle myosin IIA, IIB and IIC; alpha-actinin 1 and 4; and filamin A and B -- looking at a range of model systems, such as cell lines and human cancer patient tissue samples, to test the concept that overwhelming the response of some of these proteins could be harnessed as a tool to return invasive cells to a more stable, less active, noninvasive state.

In a series of experiments, investigators found that these so-called mechanoresponsive proteins that accumulate in response to mechanical stress are upregulated or overproduced in human pancreatic cancer tissue samples and cell lines, and that these proteins directly impact cell mechanics. Highly mechanoresponsive proteins, including nonmuscle myosin IIA and IIC, alpha-actinin 4 and filamin B, increased in expression in human pancreatic cancer samples compared to levels found in healthy tissue. In contrast, non-mechanoresponsive proteins -- nonmuscle myosin IIB, alpha-actinin 1 and filamin A -- had smaller changes in expression or disappeared with cancer progression.

The team also found that myosin IIC, while low in overall abundance in pancreatic cancer cells, had a profound impact on cell architecture, movement, behavior and mechanics. Exposing cells to 4-HAP increased myosin IIC assembly, stiffening cells. The group then tested 4-HAP as a treatment in a mouse model of pancreatic cancer, finding that the compound led to a 50% reduction in metastases to the liver in mice transplanted with human pancreatic tissue.

Modulating mechanoresponsive proteins like myosin IIC has several advantages over other cancer treatment strategies, says Robinson. First, scientists can fine-tune the activity of proteins that are upregulated in cancerous tissue, harnessing cells' natural protein makeup to revert them to more normal types, while also protecting healthy cells that do not upregulate the targeted proteins. Second, this strategy draws upon the normal biochemistry of the protein to overwhelm the mechanics of the system.

"There's a misconception that proteins that make up the mechanobiome do not make good targets for pharmacological development for cancer patients," adds Surcel. "That overlooks an entirely new targetable drug space, and our work on 4-HAP really demonstrates the viability of pursuing this class of proteins that have been relegated to the wayside."

According to the most recent figures from the National Cancer Institute, an estimated 73,554 people are living with pancreatic cancer in the United States. Pancreatic cancer represents 3.2% of all new cancer cases in the U.S. Just 9.3% of patients survive five years or more after diagnosis.

The laboratory is working on different versions of 4-HAP that could be tested on animals, Surcel says. Early evidence from Johns Hopkins and other labs indicates that 4-HAP also may be helpful in the treatment of colorectal and breast cancers.

Credit: 
Johns Hopkins Medicine

Over 14% efficiency for ternary organic solar cell with 300 nm thick active layer

image: The J-V characteristics for OSCs based on PBDB-T-2Cl: BTP-4F, PBDB-T-2Cl: PC61BM and PBDB-T-2Cl: BTP-4F: PC61BM; the chemical structures of active layer components.

Image: 
©Science China Press

Organic solar cells (OSCs) have drawn great attention because they enable large-area, flexible solar panels to be made through low-cost solution-coating methods. Recently, single-junction OSCs with over 16% power conversion efficiency (PCE) have been reported. However, the photovoltaic performance of these cells is very sensitive to variation in the active layer thickness, which has been recognized as a major challenge for the practical application of OSCs.

The photovoltaic performance of OSCs is determined by the open-circuit voltage (VOC), short-circuit current density (JSC) and fill factor (FF). In current high-efficiency non-fullerene-based systems, OSCs usually show a sharp drop in FF as the active layer thickness increases. Such FF drops are generally caused by poor and imbalanced charge transport, which results in enhanced bimolecular charge recombination and the formation of space charge in thicker films.
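These three quantities combine in the standard efficiency relation PCE = (VOC x JSC x FF) / Pin, where Pin is the incident optical power (100 mW/cm2 under standard AM1.5G illumination). A minimal sketch in Python, with input values that are hypothetical examples rather than numbers from the paper:

    # Standard solar cell efficiency relation; the values below are
    # illustrative, in the range typical of non-fullerene OSCs.
    P_IN = 100.0  # incident power under AM1.5G illumination, mW/cm^2

    def pce(voc_volts, jsc_ma_per_cm2, ff):
        """Power conversion efficiency as a fraction."""
        return voc_volts * jsc_ma_per_cm2 * ff / P_IN

    print(f"PCE = {pce(0.85, 25.0, 0.672):.1%}")  # ~14.3%

The relation makes the thickness problem concrete: if FF falls sharply in thicker films, PCE drops with it even when VOC and JSC hold steady.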

Very recently, Professor Jianhui Hou's group at the Institute of Chemistry, Chinese Academy of Sciences, demonstrated a thick-film (300 nm) ternary OSC with a power conversion efficiency of 14.3%. This excellent photovoltaic performance was achieved by introducing phenyl-C61-butyric-acid-methyl ester (PC61BM) into a PBDB-T-2Cl: BTP-4F host blend. They found that the addition of PC61BM helps improve the hole and electron mobilities, and thus facilitates charge transport in thick active layers, leading to improved efficiencies. Their results illustrate that introducing a fullerene derivative as a third component is a facile and effective strategy for realizing efficient thick-film OSCs.

Credit: 
Science China Press

New lithium battery design could mean lighter, safer batteries for Soldiers

image: Gleb Yushin, a professor in Georgia Tech's School of Materials Science and Engineering and Kostiantyn Turcheniuk, research scientist in Yushin's lab, inspect a battery using a new cathode design that replaces expensive metals and traditional liquid electrolyte with lower cost transition metal fluorides and a solid polymer electrolyte. The research was funded by the U.S. Army.

Image: 
Allison Carter, Georgia Tech

RESEARCH TRIANGLE PARK, N.C. -- Less expensive, lighter and safer batteries are a vital need for warfighters; a new Army project may offer a solution.

The growing popularity of lithium-ion batteries in recent years has put a strain on the world's supply of cobalt and nickel -- two metals integral to current battery designs -- and sent prices surging.

In a bid to develop alternative designs for lithium-based batteries that rely less on those scarce metals, researchers at the Georgia Institute of Technology, funded by the Army Research Office, have developed a promising new cathode and electrolyte system that replaces expensive metals and traditional liquid electrolyte with lower-cost transition metal fluorides and a solid polymer electrolyte.

The Army Research Office is an element of the U.S. Army Combat Capabilities Development Command's Army Research Laboratory, the Army's corporate research laboratory.

"Electrodes made from transition metal fluorides have long shown stability problems and rapid failure, leading to significant skepticism about their ability to be used in next generation batteries," said Gleb Yushin, a professor in Georgia Tech's School of Materials Science and Engineering. "But we've shown that when used with a solid polymer electrolyte, the metal fluorides show remarkable stability -- even at higher temperatures -- which could eventually lead to safer, lighter and cheaper lithium-ion batteries."

In a typical lithium-ion battery, energy is released during the transfer of lithium ions between two electrodes -- an anode and a cathode, with a cathode typically comprising lithium and transition metals such as cobalt, nickel and manganese. The ions flow between the electrodes through a liquid electrolyte.

"Professor Yushin has identified a novel approach to enable the use of Iron Fluoride cathodes and addresses issues with dimensional changes and parasitic side reactions to develop Lithium batteries," said Dr. Robert Mantz, division chief, electrochemistry, Army Research Office. "Soldier-wearable technologies are expected to increase significantly, as will the need for power and energy sources to operate them. This research could make battery power more readily available to Soldiers in a form that is safe and easily transportable."

For the study, published in the journal Nature Materials, the research team fabricated a new type of cathode from iron fluoride active material and a solid polymer electrolyte nanocomposite. Iron fluorides have more than double the lithium capacity of traditional cobalt- or nickel-based cathodes. In addition, iron is 300 times less expensive than cobalt and 150 times less costly than nickel.
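That capacity advantage can be roughly reproduced with Faraday's law, Q = nF/(3.6M) in mAh/g, a textbook estimate rather than a calculation from the paper:

    # Theoretical specific capacity from Faraday's law: Q = n*F / (3.6*M),
    # where n is electrons transferred per formula unit, F is the Faraday
    # constant (C/mol) and M is the molar mass (g/mol). Illustrative only.
    F_CONST = 96485.0

    def capacity_mah_per_g(n_electrons, molar_mass):
        return n_electrons * F_CONST / (3.6 * molar_mass)

    # FeF3 converts through a 3-electron reaction; LiCoO2 cycles at
    # most one electron per formula unit.
    print(f"FeF3:   {capacity_mah_per_g(3, 112.84):.0f} mAh/g")  # ~713
    print(f"LiCoO2: {capacity_mah_per_g(1, 97.87):.0f} mAh/g")   # ~274

On this idealized basis, iron fluoride's three-electron conversion reaction offers well over twice the lithium capacity per gram of a one-electron cobalt oxide cathode.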

To produce such a cathode, the researchers developed a process to infiltrate a solid polymer electrolyte into the prefabricated iron fluoride electrode. They then hot pressed the entire structure to increase density and reduce any voids.

Two central features of the polymer-based electrolyte are its ability to flex and accommodate the swelling of the iron fluoride while cycling and its ability to form a very stable and flexible interphase with iron fluoride. Traditionally, that swelling and massive side reactions have been key problems with using iron fluoride in previous battery designs.

"Cathodes made from iron fluoride have enormous potential because of their high capacity, low material costs and very broad availability of iron," Yushin said. "But the volume changes during cycling as well as parasitic side reactions with liquid electrolytes and other degradation issues have limited their use previously. Using a solid electrolyte with elastic properties solves many of these problems."

The researchers then tested several variations of the new solid-state batteries to analyze their performance over more than 300 cycles of charging and discharging at an elevated temperature of 122 degrees Fahrenheit, noting that they outperformed previous designs using metal fluorides even when those designs were kept cool at room temperature.

The researchers found that the key to the enhanced battery performance was the solid polymer electrolyte. In previous attempts to use metal fluorides, it was believed that metallic ions migrated to the surface of the cathode and eventually dissolved into the liquid electrolyte, causing a capacity loss, particularly at elevated temperatures. In addition, metal fluorides catalyzed massive decomposition of liquid electrolytes when cells were operating above 100 degrees Fahrenheit. However, at the connection between the solid electrolyte and the cathode, such dissolving doesn't take place and the solid electrolyte remains remarkably stable, preventing such degradations, the researchers wrote.

"The polymer electrolyte we used was very common, but many other solid electrolytes and other battery or electrode architectures -- such as core-shell particle morphologies -- should be able to similarly dramatically mitigate or even fully prevent parasitic side reactions and attain stable performance characteristics," said Kostiantyn Turcheniuk, research scientist in Yushin's lab and a co-author of the manuscript.

In the future, the researchers aim to develop new and improved solid electrolytes to enable fast charging and also to combine solid and liquid electrolytes in new designs that are fully compatible with conventional cell manufacturing technologies employed in large battery factories.

"The successful implementation of these materials into batteries would enable significant increases in battery safety and weight, thus reducing the weight of batteries required for Soldier power," Mantz said.

Credit: 
U.S. Army Research Laboratory

Quarter of teachers in England report 60-hour working week

One in four teachers work more than 60 hours a week and many work in the evenings, despite successive government promises to reduce their hours, according to a new UCL-led study.

The paper, published today and funded by the Nuffield Foundation, is the first piece of research to look at data from more than 40,000 primary and secondary teachers in England collected between 1992 and 2017.

The findings show that teachers work around 47 hours per week on average during term-time. This includes the time they spend on marking, lesson planning and administration, and the figure has changed little over time. In the summer term, the average working week was nearer to 50 hours.

Additionally, teachers in England worked on average eight hours more per week than teachers in comparable industrialised OECD countries. For example, in 2018, while the average full-time secondary teacher in England worked 49 hours per week, the OECD average was 41 hours. The equivalent figure for teachers in Finland was just 34 hours.

The study found that around 40% of teachers in England usually work in the evening, and 10% usually work at the weekend. Full-time secondary teachers also said they spend almost as much time on management, administration, marking and lesson planning each week (20.1 hours) as they do actually teaching pupils (20.5 hours).

Lead author, Professor John Jerrim (UCL Institute of Education) said: "This is the first study to attempt to track the working hours of teachers over such a long period of time.

"Successive secretaries of state for education have made big commitments to teachers about their working hours - how they are determined to reduce the burden of unnecessary tasks and how they will monitor hours robustly.

"Our data show just how difficult it is to reduce teacher workload and working hours. It is early days in terms of judging the effectiveness of the policies put forward over the past year. We'd like to see much closer monitoring of teachers' working hours, so that the impact of policy can be assessed as soon as possible.

"Overall, bolder plans are needed by the Government to show they are serious about reducing working hours for teachers and bringing them into line with other countries."

Researchers based their analysis upon four data sources: the Labour Force Survey, the Teaching and Learning International Survey, the UK Time-Use diaries and information gathered from the Teacher-Tapp survey app.

Together, these allowed the researchers to compare the working hours of teachers in England to other countries and to investigate change in working hours over time. They were also able to explore how the working hours of teachers vary over the academic year and throughout a regular school day.

The paper highlights that the current methods used by the Government to collect data on working hours are not as reliable as they could be and should be reformed. The researchers believe response rates are low, and the absence of diary-method data collection means the exercise adds little value over other routinely collected data sources.

Josh Hillman, Director of Education at the Nuffield Foundation, said: "Earlier this year the Government's teacher recruitment and retention strategy acknowledged the teacher supply crisis in England. This research adds to our understanding of this crisis by confirming that teachers are working persistently long hours. This has been the case for over two decades, despite a succession of policy announcements during this period.

"As previous Nuffield-funded work has shown, addressing teachers' working hours is key to the improvement of both teaching quality and supply. Taking a wider view of the health of teachers over the past 25 years, the next phase of the project will help us to gain an even better understanding of the teacher workforce."

Credit: 
University College London

A large study indicates how cities can promote walking for travel

How can cities be designed to encourage physical activity among their citizens? Coinciding with European Mobility Week, the Barcelona Institute for Global Health (ISGlobal), an institution supported by "la Caixa", has published a study describing the urban characteristics that encourage people to choose walking for travel instead of a motorized vehicle, with the benefits this entails: more physical activity and less air pollution, which result in improved health.

Lack of physical activity is among the 10 leading risk factors for mortality worldwide. The World Health Organisation (WHO) recommends that adults do at least 150 minutes of moderate-intensity physical activity - which includes walking - or 75 minutes of vigorous-intensity activity throughout the week.

The study, published in the Environmental Health Perspectives journal, forms part of the PASTA (Physical Activity through Sustainable Transport Approaches) project and was performed with data from 7,875 adults from seven European cities: Antwerp (Belgium), Barcelona (Spain), London (United Kingdom), Örebro (Sweden), Rome (Italy), Vienna (Austria) and Zurich (Switzerland).

The participants answered an online questionnaire on their walking habits: how many hours a week they walked, their criteria to choose a specific transport mode, or the availability of a motorized vehicle or bicycle, among others. The researchers also used public geographic information to collect data on the built environment in which the participants live and work or study.

Mireia Gascon, ISGlobal researcher and first author of the study, stresses that "this is the first study to address not only the built environment at the residence, but also that of the workplace or study place, providing a more accurate picture of the environment people are exposed to".

Walking and Public Transport

The results show that the people who walk the most are those who live in areas with good public transport service and a higher density of households, services and facilities. In fact, living in this type of urban environment was associated with a 12% increase in minutes walked every week, compared with people living in other environments. Although weaker, this association was also observed for the workplace or study place.

The participants who most valued safety, privacy and lower exposure to air pollution were those who walked more minutes per week. By contrast, those with a high education level and access to a car walked the least. People who did not work or study walked 65% more minutes per week than those who worked full-time.

On average, participants from Barcelona were those that walked the most (259 minutes per week), while those from Antwerp walked the least (50 minutes per week) due to the high use of bicycles in this city.

"Although walking is an easy and healthy way of achieving the recommended levels of physical activity, the growing use of motorised vehicles has contributed to decreasing levels of physical activity in the general population, and has generated health additional problems related with traffic, such as air pollution and noise", explains Gascon.

Mark Nieuwenhuijsen, study coordinator and director of the Urban Planning, Environment and Health Initiative, underlines that the results "support previous studies on the role of urban planning and transport in promoting walking, and provide new information to help achieve sustainable, healthy and liveable cities, in accordance with the Sustainable Development Goals (SDGs)". These strategies include "improving the nearby residential (and work/study) built environment with a good public transport service and a diverse offer of facilities," he adds.

Credit: 
Barcelona Institute for Global Health (ISGlobal)

NASA's Terra Satellite sees the birth of Tropical Storm Imelda

image: On Sept. 17, 2019 at 1:30 p.m. EDT (17:30 UTC), the MODIS instrument that flies aboard NASA's Terra satellite showed newly formed Tropical Depression 11 just after it made landfall along the Texas coast.

Image: 
NASA Worldview

NASA's Terra satellite passed over the western Gulf of Mexico during the early afternoon of Sept. 17 and captured a visible image of the newly formed Tropical Depression 11.

The eleventh tropical depression developed during the late morning of Sept. 17. Soon afterward, it briefly strengthened into a tropical storm and was named Imelda. Imelda then made landfall near Freeport, Texas. A Tropical Storm Warning was in effect from Sargent to Port Bolivar, Texas.

On Sept. 17 at 1:30 p.m. EDT (17:30 UTC), the Moderate Resolution Imaging Spectroradiometer, or MODIS, instrument that flies aboard NASA's Terra satellite provided a visible picture of the storm shortly after it made landfall on the southeastern Texas coastline. The storm appeared slightly elongated, and when Terra passed overhead, the western quadrant of the storm was already over land while the eastern half was over the western Gulf of Mexico.

At 1 p.m. EDT (1700 UTC), the center of Tropical Storm Imelda was located near latitude 28.7 North, longitude 95.4 West. The storm is moving toward the north near 7 mph (11 kph), and this general motion is expected to continue through early Wednesday. Maximum sustained winds are near 35 mph (55 kph), tropical storm strength, with higher gusts. Some slight strengthening is possible before the center moves onshore. The estimated minimum central pressure is 1009 millibars.

For local Houston area radar, visit: https://www.weather.gov/hgx/

A north-northwestward motion is expected Wednesday night and Thursday. On the forecast track, the center of the depression will move inland over the Upper Texas coast later today, and move farther inland tonight and Wednesday.

Credit: 
NASA/Goddard Space Flight Center

Microbiome may be involved in mechanisms related to muscle strength in older adults

image: New study identifies differences in gut microbiome composition in physically high-functioning vs low-functioning older adults, successfully transfers some of these effects into mice.

Image: 
Pixabay

BOSTON (Sept. 18, 2019)--A novel study suggests that the gut microbiome has a role in mechanisms related to muscle strength in older adults. The work, led by researchers at the Jean Mayer USDA Human Nutrition Research Center on Aging (HNRCA) at Tufts, is available as a pre-proof in advance of print in Experimental Gerontology.

The gut-muscle axis, or the relationship between gut microbiota and muscle mass and physical function, has gained momentum as a research topic in the last few years as studies have established that gut microbiota influences many aspects of health. While researchers have begun exploring the connection between the gut microbiome, muscle, and physical function in mice and younger adults, few studies have been conducted with older adults.

To gain insight into this population, the researchers compared bacteria from the gut microbiomes of 18 older adults with high-physical function and a favorable body composition (higher percentage of lean mass, lower percentage of fat mass) with 11 older adults with low-physical function and a less favorable body composition. The small study identified differences in the bacterial profiles between the two groups.

Similar bacterial differences were present when mice were colonized with fecal samples from the two human groups, and grip strength was increased in mice colonized with samples from the high-functioning older adults, suggesting a role for the gut microbiome in mechanisms related to muscle strength in older adults.

Specifically, when compared to the low-functioning older adult group, the researchers found higher levels of Prevotellaceae, Prevotella, Barnesiella, and Barnesiella intestinihominis--all potentially good bacteria--in the high-functioning older adults and in the mice that were colonized with fecal samples from the high-functioning older adults.

No significant differences in body composition or endurance capacity were observed in the colonized mice; however, the researchers note that the length of the intervention period was short and these data may warrant further study.

"While we were surprised that we didn't identify a role for the gut microbiome on the maintenance of body composition, with these results we now start to understand the role of gut bacteria in the maintenance of muscle strength in older adults," said Michael Lustgarten, last and corresponding author on the study and a researcher in the Nutrition, Exercise Physiology & Sarcopenia (NEPS) Laboratory at the HNRCA. "For example, if we were to conduct an intervention to increase Prevotella levels in the gut microbiome, we would expect to see an increase in muscle strength if these bacteria are involved. Prevotella's role in the maintenance of muscle strength in older adults is one area we expect to continue to explore."

"As we age, body composition, muscle strength, and lean mass all decrease," said first author Roger Fielding, director of the NEPS Laboratory at the HNRCA. "Identifying differences in bacteria present in the high-functioning and low-functioning groups in this study moves us toward a fuller understanding of both the gut microbiome and healthy aging."

For the study, the researchers measured lower extremity function, mobility, and strength in the sedentary older adult group (ages 70-85) at the first and one-month study visits. In the mice, they measured body composition with quantitative magnetic resonance imaging, and grip strength and treadmill endurance capacity to test physical function. Fecal samples from the older adults were transplanted into young, gender-matched germ-free mice. Four weeks after fecal transfer, the researchers measured body composition, physical function, and gut microbiome in the 18 mice colonized with fecal samples from the high-functioning human group and the 18 mice colonized with fecal samples from the low-functioning human group.

The authors note the small sample size and brief time period as potential study limitations.

Credit: 
Tufts University, Health Sciences Campus

Comparing effectiveness of 2 surgical methods for uterine prolapse

Bottom Line: Uterine prolapse happens when weakened muscles and ligaments no longer provide enough support for the uterus, which then protrudes into or out of the vagina. This randomized clinical trial compared the effectiveness of two surgical methods to treat women: a vaginal hysterectomy to remove the uterus with ligament suspension to support remaining tissue or uterus-sparing suspension techniques, known as hysteropexy. A previous review of trials comparing these techniques didn't find one superior to the other. This study included 175 postmenopausal women with uterine prolapse who had a mesh-augmented repair with the uterus remaining in place (transvaginal mesh hysteropexy) or vaginal hysterectomy (removal of the uterus) with ligament suspension. Researchers report that the vaginal mesh hysteropexy was 12 percentage points better than the hysterectomy procedure at three years for the composite outcome that combined retreatment of prolapse, prolapse beyond the hymen or prolapse symptoms, but this was not quite statistically significant. Further research (including continued follow-up in this trial) is needed to assess whether vaginal mesh hysteropexy is superior. A potential limitation of the study is the inclusion of only postmenopausal women because rates of pain or sexual pain could be different in younger patients.

Authors: Charles W. Nager, M.D., UC San Diego Health, San Diego, and coauthors

(doi:10.1001/jama.2019.12812)

Editor's Note: The article includes conflict of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, financial disclosures, funding and support, etc.

Credit: 
JAMA Network

Every step a cell takes, every move they make -- scientists will be watching

image: Microfluidic chip for long-term culturing and time-lapse imaging of embryonic stem cells. The scale bar represents 5 mm.

Image: 
Ben-Yakar Lab at UT Austin

WASHINGTON, D.C., September 17, 2019 -- An interdisciplinary team has found a solution to a problem plaguing developmental biology -- long-term cell tracking and manipulation.

Mechanical engineer Adela Ben-Yakar, at the University of Texas at Austin, collaborated with stem cell biologist Joshua Brickman, from the University of Copenhagen in Denmark, to painstakingly develop an automated microfluidic device for the stable imaging of mouse embryonic stem cells over a three-day period. They describe the device and its utility in Biomicrofluidics, from AIP Publishing.

"Looking at single cells and seeing how, depending on their initial state, they respond, enables us to model the differentiation processes of stem cells and better control them," said Ben-Yakar.

To understand how people develop from embryos and what can go wrong at those early stages, developmental biologists study the mother of all cells -- stem cells. By differentiation, stem cells give rise to all the different types of cells in our bodies, from the cells in our responsive retinas to those in tough toenails.

But stem cells are a diverse bunch, dynamically turning genes on and off in a pattern that can be unique from their neighbors. To really understand cause and effect -- why this or that stimulus results in a specific cellular response -- each stem cell requires separate examination. High-resolution microscopy is perfect for individually tracking cells and their cellular players, but it is technically challenging to perform long-term imaging while manipulating the temperamental stem cells.

Existing two-layer microfluidic devices have control valves in the lower layer that restrict cell culture to the upper layer. This introduces a gap between cells and the thin glass substrate, which makes high-resolution microscopy impossible. But the team designed new 3D ports with upper layer valves controlling fluid flow between layers. This enabled cells to be cultured directly on the glass of the lower layer for optimal imaging.

Ben-Yakar explained it was a long, iterative process to find the optimum conditions for cell growth within such a low-volume, specialized environment. Eventually, they settled on a timing for nutrient broth exchange that was not too frequent; exchange the broth too often and the cells can't signal to one another and don't grow.

They also looked at how fast fluid exchange could occur without stressing the cells. Once the team was able to reliably and repeatedly culture healthy stem cells, they demonstrated high-resolution imaging of fluorescent markers of stem cell stability.

"For me, it's very important to develop a user friendly platform that can be easily deployed in biology laboratories," said Ben-Yakar, who started up a company -- Newormics -- in 2016, to advance the robustness and reliability of microfluidic devices. Next on Ben-Yakar's list is to simplify the valve-control software of the new device and get it ready to send out to collaborators and to be used to better understand differentiation.

Credit: 
American Institute of Physics

Kaleidoscope mirror symmetry inspires new design for optical tools, technologies

image: Transverse components of the normalized Poynting vectors in the focal plane of the tightly focused kaleidoscope-structured vector optical fields. The direction of the transverse energy flow is shown by the white and black arrows. The transverse energy flow shows six uniform tentacles (top left) and patterns like six spanners (top right), which are useful to trap and transport the particles.

Image: 
Yue Pan/Qufu Normal University

WASHINGTON, D.C., September 17, 2019 -- In a kaleidoscope, mirrors are placed at angles to create a visual illusion of multiple, symmetric images from one original object. The number of symmetric axes in the kaleidoscope depends on the number of mirrors and angles inside.

Drawing inspiration from this multiple-axis symmetry, researchers have discovered a new method for creating mirror-symmetric axes in the polarizations of light, which allows for complex manipulations that are useful in optical tools and technologies.

For example, optical machining, photodetectors, optical cages and microscopy are all tools that rely on vector optical fields of varying degrees of complexity. Many of them use mirror symmetry in their polarization states and focal manipulation.

In the paper, published in APL Photonics, from AIP Publishing, the researchers start with a cylindrical vector optical field and introduce a kaleidoscope structure to the polarization states by assigning a parameter for the number of mirror-symmetric axes. That new parameter, which offers an additional degree of freedom, is encoded into the transmission function of a holographic grating in a spatial light modulator to generate the new vector optical fields.
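As a point of reference (the standard cylindrical vector field, not the authors' exact transmission function), the local linear-polarization direction of such fields rotates with the azimuthal angle:

\[ \mathbf{E}(r,\varphi) = A(r)\,[\cos\delta(\varphi)\,\mathbf{e}_{x} + \sin\delta(\varphi)\,\mathbf{e}_{y}], \qquad \delta(\varphi) = m\varphi + \varphi_{0}. \]

In the kaleidoscope scheme, the dependence of \(\delta\) on \(\varphi\) is generalized so that the polarization pattern repeats under reflection about a prescribed number of mirror axes, and it is that number which enters the grating's transmission function as the additional degree of freedom.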

"Mirror symmetry already exists in vector optical fields," author Yue Pan said. "However, no research has systematically studied the arbitrary symmetry properties of the original vector optical fields and spin angular momentum and energy flow in the focal plane. We first propose the kaleidoscope-structured vector optical fields with arbitrary symmetry, and gain spin angular momentum and energy flow with good properties and applications."

By implementing multiple mirror symmetry axes into the design, they were able to create tightly focused fields in various useful shapes, including a subwavelength flattop sharp line that can be used in optical storage and lithography, and various cross, gear and hexagon shapes with tentacles and spanners that are useful for optical trapping. They were also able to introduce the elliptical polarization into the design of the new vector fields, which Pan said could help to further control the design and generation of kaleidoscope-structured vector optical fields.

Author Hui-Tian Wang, leader of the group, said they plan to study a theory that can accurately predict and manipulate the symmetry properties of tightly focused optical fields, propose new kinds of vector optical fields with novel orbital angular momentum, and further study the spin-orbital angular momentum conversion and coupling.

Credit: 
American Institute of Physics

Miniaturizing medical imaging, sensing technology

image: Scientists have used a microchip to map the back of the eye for disease diagnosis. The interference technology used in the microchip has been around for a little while. This is the first time technical obstacles have been overcome to fabricate a miniature device able to capture high quality images.

Image: 
Columbia University.

WASHINGTON, D.C., September 17, 2019 -- Scientists in Christine Hendon's and Michal Lipson's research groups at Columbia University, New York, have used a microchip to map the back of the eye for disease diagnosis.

The interference technology used in the microchip - which works like bat sonar, but with light instead of sound waves - has been around for a little while. This is the first time that technical obstacles have been overcome to fabricate a miniature device able to capture high-quality images.

Ophthalmologists' current optical coherence tomography (OCT) devices and surveyors' light detection and ranging (LIDAR) machines are bulky and expensive. There is a push for miniaturization to produce cheap handheld OCT devices and LIDAR units small enough to fit into self-driving cars.

In APL Photonics, from AIP Publishing, the team demonstrates their microchip's ability to produce high-contrast OCT images 0.6 millimeters deeper in human tissue.

"Previously, we've been limited, but using the technique we developed in this project, we're able to say we can make any size system on a chip," said co-author Aseema Mohanty. "That's a big deal!"

Author Xingchen Ji is similarly excited and hopes the work receives industry funding to develop a small, fully integrated handheld OCT device for affordable deployment outside of hospitals in low-resource settings. Clearly seeing the advantages of miniaturization in interference technologies, both the National Institutes of Health and the U.S. Air Force funded Ji's project.

Central to the chip-scale interferometer is the fabrication of a tunable delay line. A delay line controls how light waves interact, and by tuning to different optical paths - which are like different focal lengths on a camera - it collates the interference pattern to produce a high-contrast 3D image.

Ji and Mohanty coiled a 0.4-meter Si3N4 delay line into a compact 8-square-millimeter area and integrated the microchip with micro-heaters to optically tune the heat-sensitive Si3N4.

"By using the heaters, we achieve delay without any moving parts, so providing high stability, which is important for image quality of interference-based applications," said Ji.

But with components tightly bent in a small space, it's hard to avoid losses when changing the physical size of the optical path. Ji previously optimized fabrication to prevent optical loss. He applied this method alongside a new tapered region to accurately stitch lithographic patterns together - an essential step for achieving large systems. The team demonstrated the tunable delay line microchip on an existing commercial OCT system, showing that deeper depths could be probed while maintaining high resolution images.

This technique should be applicable to all interference devices, and Mohanty and Ji are already starting to scale LIDAR systems, one of the biggest photonic interferometry systems.

Credit: 
American Institute of Physics

Research reveals the crucial role of recycling in the evolution of life in our universe

New research by astrophysicists at the University of Kent reveals vital clues about the role recycling plays in the formation of life in our universe.

By investigating the different stages in the life journey of stars and gaining new knowledge about their evolutionary cycle, scientists at the Centre for Astrophysics and Planetary Science have discovered more about a crucial stage in the emergence of life in our Universe. Their research reveals for the first time how matter discarded as stars die is recycled to form new stars and planets.

Scientists have long known that the materials that make up human life were not present during the beginnings of the universe. Elements such as carbon and oxygen form deep inside stars and are released when the stars explode. What has not been clear is what happens to these materials in the vast majority of stars which do not explode and how they are then extracted to contribute to the development of new planets and biospheres.

In their paper 'Numerical simulations of wind-driven protoplanetary nebulae - I. near-infrared emission', which was published by the Royal Astronomical Society on 12 September, Professor Michael Smith and PhD student Igor Novikov have discovered this vital missing link. By carrying out 2-D modelling on their Forge supercomputer, which mapped the pattern of light emitted from stars under different environmental conditions, the research team were able to understand how the material ejected is transferred and mixed with interstellar gas to form new astronomical objects.

For the first time, the physicists simulated the detailed formation of protoplanetary nebulae - astronomical objects that develop during a star's late evolution. They modelled the formation of the shell of material that is released as a star ages. These shells form planetary nebulae, or ring-shaped clouds of gas and dust, which are visible in the night sky.

The study revealed how the gas and energy expelled by stars are returned to the universe, and in what forms. It found that the elements produced by dying stars are transferred through a process of fragmentation and recycled into new stars and planets.

Professor Smith said: 'Initially, we were perplexed by the results of our simulations. We needed to understand what happens to the expelled shells from dying red giants. We proposed that the shells must be temporary: if they stayed intact, life could not exist in our universe and our planets would be unoccupied.

'The shells are not uniform. Most are likely to be cold and molecular. They disintegrate into protruding fingers and so lose their integrity. In contrast, warm atomic shells remain intact. This provides vital clues about how carbon and other materials are transferred and reused within our universe. Our civilisation happens to exist when the generation of recycled material is at its highest. That is probably no coincidence.'

Credit: 
University of Kent