Tech

NASA sees compact Douglas strengthening to a major hurricane

image: NASA-NOAA's Suomi NPP satellite provided forecasters with a visible image of Hurricane Douglas at 5:54 p.m. EDT (2154 UTC) as it moved through the Eastern Pacific Ocean.

Image: 
NASA/NRL

Although a compact storm, Hurricane Douglas in the Eastern Pacific is mighty: it has become the season's first major hurricane. NASA-NOAA's Suomi NPP satellite provided forecasters with an image of Douglas that showed the development of an eye as the storm quickly intensified.

Early on July 22, Douglas was still a tropical storm. By 11 a.m. EDT (1500 UTC) it had strengthened into a hurricane. At 5:54 p.m. EDT (2154 UTC), the Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP revealed that the storm had developed an eye. VIIRS showed that powerful bands of thunderstorms circled the eye, and bands of thunderstorms were spiraling into the low-level center from the northern and southern quadrants. There was an indication that only a little of the dry air that had earlier affected the storm remained. What dry air was left stretched across the northern portion of the circulation, limiting the deep convection wrapping around that part of the eye.

By July 23 at 5 a.m. EDT (0900 UTC), Douglas had rapidly intensified into a major hurricane. Maximum sustained winds had increased to near 120 mph (195 kph) with higher gusts, making Douglas a Category 3 hurricane on the Saffir-Simpson Hurricane Wind Scale. Hurricane-force winds extend outward up to 25 miles (35 km) from the center and tropical-storm-force winds extend outward up to 105 miles (165 km).
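For context, the Saffir-Simpson category follows directly from the maximum sustained wind speed. Below is a minimal sketch of that mapping, using the standard National Hurricane Center thresholds in mph; it is an illustration only, not part of any NASA or NHC software.

```python
def saffir_simpson_category(wind_mph: float) -> int:
    """Map 1-minute sustained wind speed (mph) to a Saffir-Simpson category.

    Returns 0 for winds below hurricane strength. Thresholds are the
    standard National Hurricane Center values.
    """
    for floor, category in [(157, 5), (130, 4), (111, 3), (96, 2), (74, 1)]:
        if wind_mph >= floor:
            return category
    return 0

print(saffir_simpson_category(120))  # -> 3, consistent with Douglas's 120 mph winds
```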

The center of Hurricane Douglas was located near latitude 13.1 degrees north and longitude 134.0 degrees west. That is about 1,470 miles (2,365 km) east-southeast of Hilo, Hawaii. Douglas was moving toward the west-northwest near 17 mph (28 kph), and this general motion is expected to continue through Saturday. The estimated minimum central pressure was 967 millibars.

Satellite data on July 23 at 5 a.m. EDT showed Douglas had a ragged but nearly clear eye surrounded by cold cloud tops of minus 70 degrees Celsius (minus 94 degrees Fahrenheit), indicating very powerful thunderstorms.

NHC forecasters note that some additional strengthening is possible on Thursday. Gradual weakening is forecast to begin by early Friday. Interests in the Hawaiian Islands should monitor the progress of Douglas.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Study identifies spread of bee disease via flowers

image: Flowers act as hubs for transmitting diseases to bees and other pollinators.

Image: 
Paige Muñiz

ITHACA, N.Y. - One in 11 flowers carries disease-causing parasites known to contribute to bee declines, according to a Cornell University study that identifies how flowers act as hubs for transmitting diseases to bees and other pollinators.

The study, published July 20 in Nature Ecology & Evolution, also found that one in eight individual bees had at least one parasite.

The study was conducted at field sites in upstate New York, where the researchers screened 2,624 flowers from 89 species and 2,672 bees from 110 species for bee parasites through an entire growing season. They used molecular data to identify five common protozoan (free-living, single-celled) and fungal parasites.

"We know very little about transmission of these diseases," said senior author Scott McArt, assistant professor of entomology in the College of Agriculture and Life Sciences. "Our study shows that transmission can likely occur on a lot of different flowers, and the amount of disease in a community is shaped by both the floral community and the bee community."

The researchers found three main factors - flower abundance, numbers of social bees and bee diversity - played roles in disease transmission.

As the season progresses, the number of flowers goes up. For example, in the fall, flower-laden goldenrod dominates many New York fields. At the same time, the proportion of flowers with parasites goes down, lowering the risk that a bee will pick up a parasite when it visits a flower.

"That has really important conservation implications, because if you want to limit disease spread, just plant a lot of flowers," said McArt, adding that planting flowers also provides food for pollinators. "It's a win-win: If we plant flowers and create a lot of forage, we can also dilute disease."

The study revealed that social bees, such as honeybees and bumblebees, were more likely to be infected with parasites than solitary bee species. The researchers found that later in the season, the number of social bees increases, while bee diversity overall decreases.

And as a general rule, diversity of species lowers the spread of disease.

"Both bee diversity and fewer of the social bees make it less likely for bees [overall] to be infected. That's another win for conservation: if we promote bee diversity, there will be less disease," McArt said. High numbers of infections in the social species may also spill over to infect other species, he said.

Future studies will try to determine whether increased flower abundance cancels out the negative effects of increased numbers of social bees combined with lower overall bee diversity later in the summer.

More study is also needed to understand why social bees are so susceptible to parasites: whether they lack defenses, and whether they share disease in close colony quarters.

Credit: 
Cornell University

NASA finds strength in new Gulf Tropical Depression 8

image: On July 23 at 4:05 a.m. EDT (0805 UTC), the MODIS instrument aboard NASA's Aqua satellite gathered temperature information about Tropical Depression 8's cloud tops. MODIS found several areas of powerful thunderstorms (red) where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius).

Image: 
NASA/NRL

NASA's Aqua satellite used infrared light to identify the strongest storms and coldest cloud top temperatures in Tropical Depression 8, spinning in the Gulf of Mexico.

Tropical Depression 8 formed by 11 p.m. EDT on July 22, about 530 miles (855 km) east-southeast of Port O'Connor, Texas.

On July 23 at 4:05 a.m. EDT (0805 UTC), the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Aqua satellite analyzed Tropical Depression 8's cloud tops in infrared light. Infrared data provides temperature information, and the strongest thunderstorms that reach high into the atmosphere have the coldest cloud top temperatures.

Aqua found the most powerful thunderstorms around the center of circulation and areas east of the center, where temperatures were as cold as or colder than minus 70 degrees Fahrenheit (minus 56.6 Celsius). Cloud top temperatures that cold indicate strong storms with the potential to generate heavy rainfall.
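To illustrate the kind of thresholding forecasters apply, the sketch below flags grid cells whose infrared cloud-top temperatures fall at or below the minus 70 degrees Fahrenheit (minus 56.6 Celsius) cutoff cited above. The temperature grid is invented for demonstration; real MODIS data would come from NASA's data distribution services.

```python
import numpy as np

# Invented sample grid of infrared cloud-top temperatures (degrees Celsius).
cloud_top_temps_c = np.array([
    [-48.0, -62.3, -71.5],
    [-70.9, -55.2, -40.1],
])

# Colder cloud tops sit higher in the atmosphere and mark stronger storms.
strong_storms = cloud_top_temps_c <= -56.6  # the minus 70 F threshold
print(strong_storms)
# [[False  True  True]
#  [ True False False]]
```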

At 2 p.m. EDT (1800 UTC), the National Hurricane Center (NHC) issued a Tropical Storm Watch from Port Mansfield to High Island, Texas.

At that time, the center of Tropical Depression 8 was located by an Air Force Reserve Hurricane Hunter aircraft near latitude 26.0 degrees north and longitude 90.3 degrees west. That is about 415 miles (665 km) east-southeast of Port O'Connor, Texas. In 15 hours, the storm had moved 115 miles closer to Port O'Connor.

The depression is moving toward the west-northwest near 7 mph (11 kph), and a west-northwestward to westward motion is expected during the next couple of days. Maximum sustained winds are near 35 mph (55 kph) with higher gusts. The latest minimum central pressure reported by the Hurricane Hunter Aircraft is 1008 millibars.

Satellite data at 2 p.m. EDT revealed Tropical Depression 8 was getting better organized, with a better-defined center located near the northeastern end of a broadly curved convective band.

Slow strengthening is expected, and the depression could become a tropical storm during the next 12 to 24 hours. On the forecast track, the center of the depression is expected to move across the northwestern Gulf of Mexico today and Friday and make landfall along the Texas coast on Saturday.

Credit: 
NASA/Goddard Space Flight Center

Preventing the next pandemic

Thus far, COVID-19 has cost at least $2.6 trillion and may cost ten times this amount. It is the largest global pandemic in 100 years. Six months after emerging, it has killed over 600,000 people and is having a major impact on the global economy.

"How much would it cost to prevent this happening again? And what are the principal actions that need to be put in place to achieve this?" asked Andrew Dobson, a professor of ecology and evolutionary biology at Princeton. He and colleague Stuart Pimm of Duke University assembled a team to seek answers.

Their team has now written a Policy Forum article, a research-based opinion piece, for the journal Science. In it, the multidisciplinary group of epidemiologists, wildlife disease biologists, conservation practitioners, ecologists and economists argues that an annual investment of $30 billion would pay for itself quickly.

"There have been at least four other viral pathogens that have emerged in the human population so far this century. Investment in prevention may well be the best insurance policy for human health and the global economy in the future," Pimm said.

Two major factors loom large as drivers of emerging pathogens: destruction of tropical forests and the wildlife trade. Each has contributed two of the four major emerging diseases of the last 50 years: COVID-19, Ebola, SARS and HIV.

Both deforestation and the wildlife trade also cause widespread damage to the environment on multiple fronts, so there are diverse benefits associated with reducing them, note the researchers. Increased monitoring and policing of these activities would allow future emerging viruses to be detected at a much earlier stage, when control could prevent further spread.

All the credible genetic evidence points to COVID-19 emerging from a bat species traded as food in China. The wildlife trade is a major component of the global economy, with principal economic products including food, medicine, pets, clothing and furniture. Some of these are traded as luxury goods, which can create an intimate association that enhances the risk of pathogen transmission to the merchant or the buyer. Wildlife markets are invariably poorly regulated and unsanitary.

The organization tasked with monitoring international wildlife trade -- the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) -- has a net global budget of "a mere $6 million," said Dobson. "Many of the 183 signatories are several years in arrears in their payments."

The monitoring of this trade needs to be expanded, the authors argue. In particular, scientists need vital information about the viral pathogens circulating in potential food and pet species. They suggest using regional and national wildlife trade monitoring groups, integrated with international organizations for monitoring animal health.

Monitoring and regulating this trade will not only ensure stronger protection for the many species it threatens, the authors say; it will also create a widely accessible library of genetic samples that can be used to identify novel pathogens when they emerge. Such a viral genetic library would serve two key roles: more speedily identifying the source and location of future emerging pathogens, and supporting development of the tests needed to monitor future outbreaks.

Ultimately, this library will contain the information needed to speed the development of future vaccines.

Although there have been calls to close the "wet markets" where wild and domestic animals are sold in order to prevent future outbreaks of emerging pathogens, the authors acknowledge that many people depend on wild-sourced foods and medicines, and suggest that better health oversight of domestic markets is required instead.

They suggest that the risk of new viruses emerging can be mitigated if more people are trained in the monitoring, early detection and control of pathogens in the wildlife trade, and in working with local communities to minimize risks of exposure and onward transmission.

"In China, for example, there are too few wildlife veterinarians, and the majority work in zoos and animal clinics," said co-author Binbin Li, an assistant professor of environmental science at Duke Kunshan University in Jiangsu, China.

"Veterinarians are on the front line of defense against emerging pathogens, and globally we desperately need more people trained with these skills," noted Dobson.

The expansion and development of better ways to monitor and regulate the wildlife trade could be done for around $500 million a year, which the authors call "a trivial cost" when compared with the current costs of COVID, especially considering the add-on benefits such as curbing wildlife consumption and sustaining biodiversity.

Slowing tropical deforestation would also slow viral emergence, reduce carbon inputs into the atmosphere from forest fires, and protect forest biodiversity. On the other hand, it would reduce revenues from timber, grazing and agriculture.

Is it worth forgoing these tangible, but economically focused, benefits? The authors undertake this part of their cost-benefit analysis from two complementary economic perspectives: first ignoring and then including the benefits of stored carbon as a hedge against climate change. They make no attempt to put a value on the loss of biodiversity.

The Policy Forum article sharply focuses on the bottom-line costs needed to prevent the next COVID.

"Pathogen emergence is essentially as regular an event as national elections: once every 4 to 5 years," said co-author Peter Daszak, an epidemiologist with Ecohealth Alliance in New York, pointing to numerous studies. "New pathogens have appeared at roughly the same rate as new presidents, congressmen, senators and prime ministers!"

"We may see the costs of COVID soar to beyond $8 to $15 trillion with many millions of people unemployed and living under lockdown," said co-author Amy Ando, a professor of agricultural and consumer economics at the University of Illinois-Urbana Champaign.

The annual cost of preventing future outbreaks is roughly comparable to 1 to 2% of annual military spending by the world's 10 wealthiest countries. "If we view the continuing battle with emerging pathogens such as COVID-19 as a war we all have to win, then the investment in prevention seems like exceptional value," Dobson said.

Credit: 
Princeton University

Frequent social media use influences depressive symptoms over time among LGBTQ youth

PULLMAN, Wash. - Frequent social media use can impact depressive symptoms over time for LGBTQ youth, according to research from a Washington State University communication professor.

Traci Gillig, an assistant professor in the Edward R. Murrow College of Communication at Washington State University, found that when LGBTQ adolescents attended a social media-free summer camp, they experienced a reduction in depressive symptoms, as outlined in her 2020 research "Longitudinal analysis of depressive symptoms among LGBTQ youth at a social media-free camp".

According to Gillig, social media use may foster a positive sense of self and a perception of being valued in a society or community, or it may do the opposite, which can affect adolescents' psychological well-being. Youth with more negative emotional or psychological symptoms are at higher risk than their peers of developing problematic online engagement patterns in attempts to ease psychological distress.

Previous research reveals that nearly half of youth (42%) report that social media takes away from in-person, face-to-face time with friends. Many also report feelings of social exclusion, popularly known as FOMO ("fear of missing out").

In Gillig's recently published study, LGBTQ youth ages 12-18 were surveyed before and after attending a social media-free summer leadership camp for LGBTQ youth. Survey questions examined the relationship between youth's social media use prior to camp and changes in their depressive symptoms during the program.

When examining the role of social media use in changes in depressive symptoms over time, significant findings emerged. Before attending the camp, participants spent an average of about four hours a day on social media, and their depressive symptoms were moderate. By the end of the social media-free camp, depressive symptoms had dropped by about half.

Youth with the highest levels of pre-camp social media use tended to experience a more "across the board" reduction in depressive symptoms. Gillig attributes this to the social, affirming camp setting, which may have filled a critical need for social interaction among the high-volume social media users.

These findings highlight the positive influence of a "social media break" in a supportive environment on mental health, especially for LGBTQ youth. They also demonstrate the value of face-to-face interactions and how many youth may be unaware of the psychological benefits they could experience by trading social media time for face-to-face interactions in supportive contexts.

Face-to-face interactions can be even more beneficial for marginalized groups, including LGBTQ adolescents, who may not have access to supportive contacts within their local community. Affirming programming that brings together LGBTQ youth for in-person relationship development, such as camps for LGBTQ individuals, shows promise to improve youth mental health trajectories.

Gillig hopes that other researchers continue to test for relationships between social media use and psychological distress, especially its impact on LGBTQ youth mental health over time. More research is needed to help practitioners make informed recommendations to distressed LGBTQ youth and their parents as to whether the youth may benefit from simply unplugging from social media or from unplugging in the context of LGBTQ-affirming programming.

Credit: 
Edward R. Murrow College of Communication

Study finds decline in emergent hospitalizations during early phase of COVID-19

Boston - Early reports have shown the COVID-19 pandemic has resulted in a decline in patients seeking outpatient medical care. Whether and how the pandemic has affected patients seeking care for emergent conditions - emergent medical, surgical and obstetric hospitalizations - remains unclear, though emerging studies, including one from colleagues at Beth Israel Deaconess Medical Center (BIDMC), demonstrate a reduction in patients seeking care for heart attacks, strokes and cancer.

In a new study published today in the Journal of General Internal Medicine, researchers from BIDMC report on the decline of emergent medical, surgical and obstetric hospitalizations at the medical center during the six-week period following the week of the declaration of the COVID-19 public health emergency in Boston in mid-March 2020. Comparing data from the same period in 2019, the authors found a 35 percent decrease in weekly hospitalizations overall and a 45 percent decrease in weekly hospitalizations that were not related to COVID-19.

"Our findings suggest that patients with life-threatening conditions may have been avoiding the hospital in the early wave of COVID-19 which may help explain recent reports of increased mortality from diseases other than COVID-19 during this time," said Timothy Anderson, MD, the study's lead author and a general internist and health services researcher in the Division of General Medicine at BIDMC and Instructor in Medicine at Harvard Medical School. "Continuing to follow trends in mortality and hospital use after the COVID surge will be important for determining if patients who delayed care are now suffering worse health and may help inform wider public health responses to future waves of the epidemic."

The researchers identified all hospital admissions to BIDMC between January 1, 2019, and April 25, 2020. They then examined the weekly incidence of admissions to emergent medical, surgical, obstetric, and psychiatric services, as well as hospitalizations for COVID-19 in 2020. After conducting a time-series analysis comparing the same six-week periods, year against year, the authors found there were significantly fewer weekly hospitalizations for emergent medical conditions: a 51 percent decrease in acute medical conditions, such as cardiac arrest or stroke; a 31 percent decrease in acute surgical conditions, such as appendicitis; a 55 percent decrease in chronic disease exacerbations, such as diabetes or asthma; and a 13 percent decrease in obstetric hospitalizations.
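As a rough illustration of the year-against-year comparison, the percent decrease is simply the change in mean weekly admissions relative to the 2019 baseline. The counts in this sketch are invented placeholders, not the study's data:

```python
# Invented weekly admission counts for the same six-week window in each year.
weekly_2019 = [410, 395, 402, 388, 415, 399]
weekly_2020 = [270, 255, 262, 248, 265, 259]

mean_2019 = sum(weekly_2019) / len(weekly_2019)
mean_2020 = sum(weekly_2020) / len(weekly_2020)
pct_change = 100 * (mean_2020 - mean_2019) / mean_2019
print(f"{pct_change:.0f}% change in mean weekly hospitalizations")  # about -35%
```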

"We are able to see from the data that the number of hospitalizations were down, but it's not clear why. People may have decided not to seek care out of fear of contracting the virus, but it's also possible that some people, such as college students, left Boston at the start of the epidemic, reducing the overall population," said Shoshana Herzig, MD, MPH, Director of Hospital Medicine Research at BIDMC, Associate Professor of Medicine at Harvard Medical School and senior author on the study. "Further studies are needed to determine the impact of the COVID-19 pandemic on long-term outcomes of patients delaying care for acute and chronic conditions."

In addition to Anderson and Herzig, co-authors include Jennifer P. Stevens, Adlin Pinheiro, and Stephanie Li, all of BIDMC.

Credit: 
Beth Israel Deaconess Medical Center

NASA examines Tropical Storm Gonzalo's structural changes

image: NASA's Aqua satellite provided a visible image to forecasters of Tropical Storm Gonzalo in the central North Atlantic Ocean on July 22, 2020.

Image: 
NASA Worldview

Visible and microwave imagery from NASA's Aqua satellite indicated Tropical Storm Gonzalo was slightly less organized than it was on the previous day.

Gonzalo formed in the central North Atlantic Ocean on July 21 and is moving west.

The Moderate Resolution Imaging Spectroradiometer or MODIS instrument that flies aboard NASA's Aqua satellite captured a visible image of Gonzalo late on July 22. The image was created by NASA Worldview at NASA's Goddard Space Flight Center in Greenbelt, Md. The July 22 visible image of Gonzalo showed a central dense overcast that had become a bit ragged. The banding of thunderstorms seen in earlier images had dissipated.

Microwave imagery captured at 12:53 a.m. EDT (0452 UTC) on July 23 from the Atmospheric Infrared Sounder or AIRS instrument also aboard Aqua, indicated a small convective ring (rising air that forms the thunderstorms that make up a tropical cyclone) was present under the overcast. At 11 a.m. EDT, the National Hurricane Center noted, "Recently-obtained microwave data from overnight shows that Gonzalo's center is a little farther south than previously estimated."

On July 23, NHC forecaster Robbie Berg noted, "The storm's structure has become a little disheveled since yesterday, with the deep convection losing some organization."

Hurricane Watch in Effect

On July 23, the National Hurricane Center (NHC) issued a Hurricane Watch for Barbados and St. Vincent and the Grenadines.

Gonzalo's Status on July 23

The National Hurricane Center (NHC) said at 11 a.m. EDT (1500 UTC), the center of Tropical Storm Gonzalo was located near latitude 9.6 degrees north and longitude 48.3 degrees west. That is about 885 miles (1,425 km) east of the southern Windward Islands. Gonzalo is moving toward the west near 14 mph (22 kph). A westward to west-northwestward motion with an increase in forward speed is expected through the weekend.

Maximum sustained winds are near 65 mph (100 kph) with higher gusts. Gonzalo is a small storm, and tropical-storm-force winds extend outward up to 35 miles (55 km) from the center. The estimated minimum central pressure is 997 millibars.

NHC provided two key messages about the storm:

1. There is an increasing risk of wind and rain impacts from Gonzalo in portions of the southern Windward Islands this weekend; however, there is significant uncertainty in how strong Gonzalo will be when it moves across the islands.
2. Despite the uncertainty in Gonzalo's future intensity, hurricane conditions are possible across portions of the southern Windward Islands. Hurricane Watches are currently in effect for Barbados and St. Vincent and the Grenadines, and additional watches for other islands could be required later today. Interests in the southern Windward Islands should monitor the progress of Gonzalo and follow any advice given by local officials.

NHC said, "Some strengthening is forecast during the next couple of days, and Gonzalo could become a hurricane tonight or on Friday. On the forecast track, the center of Gonzalo will approach the southern Windward Islands Friday night and move across the islands Saturday and Saturday evening."

Credit: 
NASA/Goddard Space Flight Center

Different from a computer: Why the brain never processes the same input in the same way

image: How excitable the cortex is to a stimulus (lightning symbol) is not left to chance. Rather, the alternation between weaker and stronger excitability follows a certain temporal pattern (violet line = single brain response as EEG pattern).

Image: 
Stephani/ MPI CBS

Rustling leaves, light rain at the window, a quietly ticking clock - muffled sounds, just above the threshold of hearing. One moment we perceive them, the next we don't, even if we, or the sounds, don't seem to change. Many studies have shown that we never process an incoming stimulus, be it a sound, an image, or a touch, in the same way, even when the stimulus is exactly the same. This is because the impact a stimulus makes on the brain regions that process it depends on the momentary state of the networks those regions belong to. However, the factors that influence and underlie the constantly fluctuating momentary state of these networks, and whether these states are random or follow a rhythm, were previously unknown.

Now, scientists at the Max Planck Institute for Human Cognitive and Brain Sciences (MPI CBS) in Leipzig, Germany, have discovered that the state of the network at the time the stimulus-related information reaches the cerebral cortex influences how strongly the brain reacts to the stimulus. Depending on the network state, certain nerve cells in the so-called primary somatosensory cortex can be more or less 'excitable', which shapes the subsequent stimulus processing in the brain. This means that the brain's response is modulated already at the entry to the cerebral cortex, and depends on more than how the stimulus is evaluated at higher, downstream levels.

"There is always a certain amount of activity between the neurons of a network, even if there are apparently no external influences on us. So, the system is never completely still or inactive," explains Tilman Stephani, PhD student at the MPI CBS and first author of the study, which has now been published in the Journal of Neuroscience. Rather, information is constantly arising, from inside the body about our heartbeat, digestion, or breathing, information about our position in space, the temperature, and internally generated thoughts. In addition, intrinsic neuronal activity occurs even if neuronal networks are isolated from any input. This constantly affects the excitability of various brain networks. "The dynamics from internal processes are thus associated with the system´s excitability and therefore, also the stimulus response," says Stephani. "So, the brain does not seem to function like a computer where the same incoming information always means the same reaction."

It turns out that fluctuations of cortical excitability do not occur completely at random but rather show a certain temporal pattern: The excitability at one moment depends on previous network states and influences subsequent ones. Scientists refer to this as long-range temporal dependency or long-lasting autocorrelation.
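The sketch below illustrates the statistic in question: the lag-k autocorrelation of a trial-by-trial excitability series. White noise stands in for the real EEG-derived data here; a series with long-range temporal dependency would instead show correlations that decay slowly (roughly as a power law) with increasing lag.

```python
import numpy as np

def autocorrelation(x: np.ndarray, lag: int) -> float:
    """Estimate the autocorrelation of series x at a given positive lag."""
    x = x - x.mean()
    return float(np.dot(x[:-lag], x[lag:]) / np.dot(x, x))

rng = np.random.default_rng(0)
excitability = rng.standard_normal(5000)  # placeholder for single-trial EEG amplitudes

for lag in (1, 10, 100):
    # Near zero for white noise; slowly decaying for long-range dependent data.
    print(f"lag {lag}: r = {autocorrelation(excitability, lag):+.3f}")
```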

The fact that cortical excitability varies in this special way suggests that neuronal networks are poised at a so-called "critical" state, in which there is a delicate balance between excitation and inhibition of the network. Earlier theoretical and empirical studies have indicated that such "criticality" may be a fundamental principle underlying brain functioning, one in which information transmission and capacity are maximized. Stephani and colleagues now provide evidence that this principle may also govern the variability of sensory responses in the human brain. Presumably, this serves as an adaptive mechanism for coping with the variety of information constantly arriving from the environment: a single stimulus should neither excite the entire system at once nor fade away too quickly.

However, it is still unknown whether greater excitability leads to a more salient experience. In other words, do study participants perceive the intensity of the stimuli differently depending on the instantaneous excitability? This is now being tested in a second study. "But other processes can also play a role here," explains Stephani. "Attention, for example. If you direct it to something else, a first, strong brain response can still occur. However, higher brain processes downstream may then prevent it from being consciously perceived."

The experiments were carried out by examining the response of participants' brains to thousands of small successive electrical currents. These stimuli were applied to the participants' forearms to stimulate the main nerve in the arm. This, in turn, produced an initial reaction 20 milliseconds later in a specific area of the brain, the somatosensory cortex. Based on the evoked EEG patterns, the researchers were able to see how easily each individual stimulus excited the brain.

Credit: 
Max Planck Institute for Human Cognitive and Brain Sciences

Silicon core fishbone waveguide extends frequency comb

image: Waveguide design, Zhang et al. doi 10.1117/1.AP.2.4.046001

Image: 
Zhang et al.

Frequency combs are becoming one of the great enabling technologies of the 21st century. High-precision atomic clocks and high-precision spectroscopy are just two technologies that have benefited from the development of highly precise frequency combs. However, the original frequency comb sources required a room full of equipment. And it turns out that if you suggest that a room full of delicate equipment is perfect for a commercial application, the development engineer makes a beeline for the nearest exit.

These disadvantages would be solved by making chip-based devices that are actually robust enough to withstand the rigors of everyday use. To do that, scientists have to balance material properties with the behavior of light in a waveguide. This balance is easier to engineer in glass, while for applications and integration with existing devices, it would be better to use silicon.

It is difficult to make very wide frequency combs from silicon waveguides, but clever waveguide engineering may be about to make that task a bit easier. Zhang and colleagues, reporting in Advanced Photonics, have shown a way to make a graded index waveguide that allows the width of a frequency comb to be more than doubled (compared to a normal waveguide).

Peak alignment for a broader comb?

A frequency comb is a light spectrum that consists of many very sharply defined frequencies that are equally spaced. A power spectrum looks rather like a comb, hence the name.
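Formally, the comb lines sit at frequencies f_n = f_0 + n * f_rep, where f_rep is the fixed line spacing and f_0 is an offset. The sketch below generates a few idealized lines; the numeric values are arbitrary, chosen only to land near the telecom band, and are not taken from the paper.

```python
import numpy as np

f_0 = 0.35e12    # comb offset frequency in Hz (illustrative value)
f_rep = 0.10e12  # line spacing (repetition rate) in Hz (illustrative value)

n = np.arange(1900, 1905)   # a few line indices near 190 THz
lines = f_0 + n * f_rep     # f_n = f_0 + n * f_rep
print(lines / 1e12)         # THz: [190.35 190.45 190.55 190.65 190.75]
```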

Frequency comb generation is a delicate balance between the material properties that allow light to generate new colors of light (referred to as the optical nonlinearity), the configuration of the path the light follows (the optical resonator), and the dispersion (how the speed of light varies with wavelength in the material). The last item, dispersion, is usually the killer, and this is where the work of Zhang and colleagues focuses. To generate a very broad frequency comb, the colors that make up the comb must all stay in phase with each other. Put concretely: if two waves at one point have their peaks lined up, then at some point further along in space and time, those peaks should still line up. But, ordinarily, this never happens, and the peaks slip past each other, preventing any new frequencies from being generated.

Engineering to the rescue

To compensate for the material dispersion, researchers often turn to waveguide engineering. Since waveguides are made of materials, they have material dispersion, but the confinement of the waveguide itself introduces another type of dispersion. This waveguide dispersion depends on the shape and dimensions of the waveguide, as well as the materials used. That dependence allows engineers to counter material dispersion through their waveguide design.

But this is tough work in silicon. The silicon core has a large refractive index compared to the glass cladding. The large difference between the two creates a strong waveguide dispersion that overcompensates for the material dispersion.

The insight of Zhang and colleagues is that the interface between the glass cladding and the silicon core doesn't have to be sharp. They have designed a waveguide with a silicon core whose fishbone structure extends outward into the glass cladding. The effective refractive index in the mixed region is an average of the glass and silicon indices, so the index gradually transitions from silicon to glass: a graded index waveguide.
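To first order, the mixed fishbone region behaves as an effective medium whose index is a fill-factor-weighted average of the silicon and silica indices, which is what lets the profile grade smoothly from core to cladding. The sketch below uses typical near-infrared refractive indices; an actual design would be verified with a full electromagnetic mode solver rather than this simple linear mixing rule.

```python
# Typical near-infrared refractive indices (approximate).
N_SI, N_SIO2 = 3.48, 1.44

def effective_index(si_fill: float) -> float:
    """Linear effective-medium estimate for a silicon/silica mixed region."""
    return si_fill * N_SI + (1.0 - si_fill) * N_SIO2

for fill in (1.0, 0.75, 0.5, 0.25, 0.0):
    print(f"silicon fill {fill:.2f} -> n_eff ~ {effective_index(fill):.2f}")
```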

In the graded index waveguide, redder colors spread out to occupy a wider area of the waveguide, while bluer colors are more tightly confined. The net effect is that the different wavelengths behave as if they are traveling in waveguides of different widths, while they are actually traveling together in the same waveguide. The researchers refer to this effect as a self-adaptive boundary. They explored different configurations of the fishbone structure, each of which increased the wavelength range over which the dispersion was small.

To confirm that their graded index waveguides would result in better frequency combs, the team modeled frequency comb generation in standard and graded index waveguides. They showed that the frequency spectrum was extended from about 20 THz to about 44 THz.

Turn on the light

So far the researchers have only calculated and modeled their structures. However, the proposed structures have all been chosen with fabrication in mind, so once they get their bunny suits, test devices should be on their way. Then silicon frequency combs can really strut their stuff. A good example: silicon is transparent over a broad range of the infrared, which is also the wavelength range needed for spectroscopic identification of molecules. A chip-based frequency comb will enable high precision and high sensitivity compact spectrometers.

Read the original article in the peer reviewed, open access journal Advanced Photonics: J. Zhang et al., Adv. Photonics 2(4), 046001 (2020), doi 10.1117/1.AP.2.4.046001.

Credit: 
SPIE--International Society for Optics and Photonics

Do bicycles slow down cars on low speed, low traffic roads? Latest research says 'no'

The new article "Evidence from Urban Roads without Bicycle Lanes on the Impact of Bicycle Traffic on Passenger Car Travel Speeds," published in Transportation Research Record, the Journal of the Transportation Research Board, demonstrates that bicycles do not significantly reduce passenger car travel speeds on low speed, low volume urban roads without bicycle lanes. Authored by Jaclyn Schaefer, Miguel Figliozzi, and Avinash Unnikrishnan of Portland State University, the research shows that differences in vehicle speeds with and without cyclists were generally on the order of 1 mph or less - negligible from a practical perspective.

A concern raised by some motorists is that, on urban roads without bicycle lanes, cyclists will slow down motorized vehicles and therefore create congestion. Researchers evaluated speeds on six roads in Portland at different times of day, including peak traffic hours. They did a detailed comparative analysis of the travel speeds of passenger cars on lower volume urban roads without bicycle lanes, and found that a 1 mph differential in speed caused by the presence of a cyclist would not cause congestion.

The study also found that cyclists riding on a downhill road, and therefore traveling faster, were less likely to be overtaken by motorists. In a Forbes article on the research, "Cyclists Don't Cause Congestion: 'Must Get In Front' Maneuvers By Motorists Pointless, Finds Study," Figliozzi agreed that this has possible implications for e-bike riders, who can often travel at faster average speeds than cyclists on standard bicycles.

"[Those on] e-bikes are not as affected by uphills, and have better travel performance regarding speed and acceleration. In a low volume and low-speed street, motorists are less likely to overtake e-bikes because the speed differential is smaller or maybe zero," Figliozzi told Forbes.

This research was first presented by Jaclyn Schaefer at the annual meeting of the Transportation Research Board, and you can view that poster visualization of the research at: https://tinyurl.com/yxlxob4k. Jaclyn is a recent Eisenhower Fellow and NITC Scholar, and is currently wrapping up her studies as a master's student at Portland State University.

"The hope is that our study dissuades policymakers from tossing out shared roadways as a viable option because of the perception that bicyclists will impede the mobility and speed of drivers," Schaefer shared. "While the preference is to separate modes through separated, protected bike lanes - that's not always possible in every urban setting. 'Bike boulevards', or 'neighborhood greenways' as we call them here in Portland, are great alternatives on low-volume, low-speed roads to build out a safe, well-connected bicycle network."

The research team builds on a long legacy of Portland State University research on the case for bike boulevards, as recapped recently by PSU Urban Studies Professor and TREC Director Jennifer Dill: https://jenniferdill.net/2019/06/27/a-case-for-bike-boulevards/.

Due to limitations regarding homogeneity among some site characteristics, this study is currently being expanded to include a large number of sites displaying a more diverse range of functional classifications, roadway markings, speed limits, roadway grades, and traffic volumes and compositions. Additionally, the new study will explore how oncoming traffic speed and volume may affect opportunities for overtaking bicycles, and the potential connection to passenger car speeds on roads without bicycle lanes.

Credit: 
Portland State University

Flood data from 500 years: Rivers and climate change in Europe

video: A visualization of floods in Europe over 500 years.

Image: 
TU Wien

Overflowing rivers can cause enormous problems: Worldwide, the annual damage caused by river floods is estimated at over 100 billion dollars - and it continues to rise. To date it has been unclear whether Europe is currently in a flood-rich period from a long-term perspective.

Austrian flood expert Prof. Günter Blöschl from TU Wien (Vienna) has led a large international study involving 34 research groups that provides clear evidence that the past three decades were among the most flood-rich periods in Europe in the past 500 years, and that this period differs from others in its extent, air temperatures and flood seasonality. Compared to the past, floods tend to be larger in many places, their timing has shifted, and the relationship between flood occurrence and air temperature has reversed: in the past, floods tended to occur more frequently in cold phases, while today global warming is one of the main drivers of their increase. The results of the study have now been published in the journal Nature.

Historical data from half a millennium

"From our previous research, we already knew how climate change has affected Europe's floods in the past 50 years," says Alberto Viglione from the Politecnico di Torino, one of the key authors of the publication. "For forecasts of the next decades, however, it is also important to understand whether this is a completely new situation or whether this is just a repetition of something that has already occurred. So far, the available data had not been sufficient to ascertain whether this is the case or not. We have examined this question in great detail and can now say with confidence: Yes, flooding characteristics in recent decades are unlike those of the previous centuries."

For the study, tens of thousands of historical documents containing contemporary flood reports from the period 1500 to 2016 were analysed. The TU Wien team has worked with historians from all over Europe. "The particular challenge of this study consisted in making the very different texts of the different centuries and different cultural regions comparable," explains Andrea Kiss from the Vienna University of Technology, researcher and historian herself, and one of the key authors of the publication. "We managed to achieve this comparability by putting all the texts in their respective historical contexts with painstaking attention to detail."

Formerly cold, now warm: River floods now function differently

The data analysis identified nine flood-rich periods and associated regions. Among the most notable periods were 1560-1580 (western and central Europe), 1760-1800 (most of Europe), 1840-1870 (western and southern Europe) and 1990-2016 (western and central Europe). Comparisons with air temperature reconstructions showed that these historical flood periods were substantially cooler than intervening phases.

"This finding seems to contradict the observation that, in some areas such as in the northwest of Europe, the recent warmer climate is aligned with larger floods," says Günter Blöschl. "Our study shows for the first time that the underlying mechanisms have changed: While, in the past, floods have occurred more frequently under colder conditions, the opposite is the case now. The hydrological conditions of the present are very different from those in the past."

The timing of the floods within the year has also changed. Previously, 41% of Central European floods occurred in the summer, compared to 55% today. These shifts are related to changes in precipitation, evaporation and snowmelt and are an important indicator for distinguishing the role of climate change from that of other controls such as deforestation and river management.

These findings were made possible by a new database, compiled by the study authors, that includes the exact dating of almost all flood events reported in written sources. Until now, researchers often had to rely on other, less precise sources of information, such as lake sediments. This is the first study worldwide to evaluate historical flood periods for an entire continent in such detail.

Better data - better forecasts

Because of the shift in the flood generating mechanisms, Günter Blöschl advocates the use of tools for flood-risk assessment that capture the physical processes involved, and management strategies that can incorporate the recent changes in risk. "Regardless of the necessary efforts of climate change mitigation, we will see the effects of these changes in the coming decades," says Blöschl. "Flood management must adapt to these new realities."

Credit: 
Vienna University of Technology

Sharks almost gone from many reefs

A massive global study of the world's reefs has found sharks are 'functionally extinct' on nearly one in five of the reefs surveyed.

Professor Colin Simpfendorfer from James Cook University in Australia was one of the scientists who took part in the study, published today in Nature by the Global FinPrint organisation. He said that of the 371 reefs surveyed in 58 countries, sharks were rarely seen on close to 20 percent.

"This doesn't mean there are never any sharks on these reefs, but what it does mean is that they are 'functionally extinct' - they are not playing their normal role in the ecosystem," said Professor Simpfendorfer.

He said almost no sharks were detected on any of the 69 reefs of six nations: the Dominican Republic, the French West Indies, Kenya, Vietnam, the Windward Dutch Antilles and Qatar.

"In these countries, only three sharks were observed during more than 800 survey hours," said Professor Simpfendorfer.

Dr Demian Chapman, Global FinPrint co-lead and Associate Professor in the Department of Biological Sciences and Institute of Environment at Florida International University, said it's clear the central problem is the intersection between high human population densities, destructive fishing practices, and poor governance.

"We found that robust shark populations can exist alongside people when those people have the will, the means, and a plan to take conservation action," said Dr Chapman.

Professor Simpfendorfer said it was encouraging that Australia was among the best nations at protecting shark populations and ensuring they played their proper role in the environment.

"We're up there along with such nations as the Federated States of Micronesia, French Polynesia and the US. These nations reflect key attributes that were found to be associated with higher populations of sharks: being generally well-governed, and either banning all shark fishing or having strong, science-based management limiting how many sharks can be caught," he said.

Jody Allen, co-founder and chair of the Paul G. Allen Family Foundation which backs the Global FinPrint project, said the results exposed a tragic loss of sharks from many of the world's reefs, but also gave some hope.

"The data collected from the first-ever worldwide survey of sharks on coral reefs can guide meaningful, long-term conservation plans for protecting the reef sharks that remain," she said.

Credit: 
James Cook University

'Seeing' and 'manipulating' functions of living cells

image: (a) Cell membrane perforation of living cells based on highly localized photochemical oxidation with a catalytic TiO2-functionalized AFM probe. (b) Intracellular tip-enhanced Raman spectroscopy (TERS) imaging of molecular dynamics in living cells using an AgNP-functionalized AFM probe.

Image: 
COPYRIGHT (C) TOYOHASHI UNIVERSITY OF TECHNOLOGY. ALL RIGHTS RESERVED.

Overview

A research group composed of Professor Takayuki Shibata and his colleagues at the Department of Mechanical Engineering, Toyohashi University of Technology, has given new functionalities to atomic force microscopy (AFM). The team has succeeded in performing minimally invasive surgery on living cells using photocatalytic oxidation confined to a nanoscale space, and in visualizing dynamic information on biomolecules inside living cells. This proposed technique for controlling and visualizing the expression of cell functions has significant potential as a powerful nanofabrication and nanomeasurement platform for solving the mysteries of life.

Details

An integrated understanding of life phenomena, and the ability to control them, are absolutely essential for further development of the medical and pharmaceutical fields. The key to such life-science innovation is to solve the structure and function of biomolecules such as genomes, proteins, and sugar chains, as well as the function of cells, the basic units of life activity. The team therefore aims to establish a technology for minimally invasive surgery on living cells at the molecular level (a "God's hand" to manipulate the function of cells) and for visualizing changes in the dynamic behavior of intracellular biomolecules and the state of cell membrane proteins at the single-molecule level (a "God's eye" to see the function of cells), and thus to provide an innovative nanofabrication and nanomeasurement platform for solving the mystery of life.

Here, the research team has succeeded in giving two new functions to atomic force microscopy (AFM)1). The first advancement is to coat the tip apex of an AFM probe with a thin film of titanium dioxide (TiO2), a well-known photocatalyst. By this method, the photocatalytic reaction is localized to a nanoscale region (about 100 nm) near the tip apex, enabling minimally invasive cell membrane perforation. The probability of cell membrane perforation reached 100%, and a cell viability of 100% was also achieved, verifying that minimally invasive surgery can be carried out. The second advancement is to insert the tip apex of an AFM probe coated with silver (Ag) nanoparticles into a living cell. The team thus succeeded in acquiring sensitive Raman spectra originating from proteins, DNA, lipids, and other biomolecules (tip-enhanced Raman spectroscopy, TERS). With this method, differences in the ratio of biomolecules between a cell's nucleus and cytoplasm were visualized, and an inverse correlation (as one increases, the other decreases) was found between proteins and glycogen (also called animal starch) over time inside cells.

1) Atomic force microscopy (AFM) is a microscopy technique that detects the atomic force acting between the tip apex and the surface of a sample. It was invented by Dr. Gerd Binnig and colleagues at IBM's Zurich Research Laboratory in 1985. AFM is a powerful tool that can directly image atoms and molecules and also evaluate mechanical properties such as frictional force and hardness, as well as electric, magnetic, and thermal properties, with nanoscale spatial resolution, making it a fundamental technology of today's nanotechnology. Furthermore, AFM can operate not only in air but also in liquids, and it has therefore been widely applied in the life science and biotechnology fields.

Future Outlook

In order to achieve nanofabrication and nanomeasurement functions simultaneously, we will next establish a tip-enhanced Raman spectroscopy (TERS) function by coating the surface of a TiO2-functionalized AFM probe with Ag nanoparticles. This function will make it possible to visualize the degradation of organic substances by photocatalytic oxidation (changes in molecular structure) during the cell surgery process. We will also aim to measure single molecules of a target cell membrane protein using the high molecular-recognition ability of antigen-antibody reactions, and to establish a technique for selective nanofabrication of single molecules in the target membrane protein identified in this way. The proposed technique is expected to help solve the mechanisms of life functions and to be applied to work such as the development of novel medicines.

Credit: 
Toyohashi University of Technology (TUT)

Science sweetens native honey health claims

image: The brood sizes of native stingless bees are smaller than those of honey bees.

Image: 
(c) Tobias Smith UQ

Science has validated Indigenous wisdom by identifying a rare, healthy sugar in native stingless bee honey that is not found in any other food.

University of Queensland organic chemist Associate Professor Mary Fletcher said Indigenous peoples had long known that native stingless bee honey had special health properties.

"We tested honey from two Australian native stingless bee species, two in Malaysia and one in Brazil and found that up to 85 per cent of their sugar is trehalulose, not maltose as previously thought," she said.

Dr Fletcher said trehalulose was a rare sugar with a low glycaemic index (GI), and not found as a major component in any other foods.

"Traditionally it has been thought that stingless bee honey was good for diabetes and now we know why - having a lower GI means it takes longer for the sugar to be absorbed into the blood stream, so there is not a spike in glucose that you get from other sugars," Dr Fletcher said.

"Interestingly trehalulose is also acariogenic, which means it doesn't cause tooth decay."

Dr Fletcher said the findings would strengthen the stingless bee honey market and create new opportunities.

"Stingless bee honey sells now for around AUD $200 per kilogram, which is up there with the price of Manuka and Royal Jelly honey," she said.

"The high commercial value also makes it a risk for substitution, where people could sell other honey as stingless bee honey, or dilute the product.

"But due to this research, we can test for this novel sugar, which will help industry to set a food standard for stingless bee honey.

"People have patented ways of making trehalulose synthetically with enzymes and bacteria, but our research shows stingless bee honey can be used as a wholefood on its own or in other food to get the same health benefits."

The work of Dr Fletcher and the research team has led to a new project funded by AgriFutures Australia and supported by the Australian Native Bee Association.

Working with Dr Natasha Hungerford from UQ's Queensland Alliance for Agriculture and Food Innovation and Dr Tobias Smith from the School of Biological Sciences, the new project will investigate storage and collection methods to optimise the trehalulose content of Australian stingless bee honey.

Stingless bees (Meliponini) occur in most tropical and sub-tropical regions, with more than 500 species across Neotropical, Afrotropical and Indo-Australian regions.

Like the well-known Apis mellifera honeybees, stingless bees live in permanent colonies made up of a single queen and workers, who collect pollen and nectar to feed larvae within the colony.

Dr Fletcher said keeping native stingless bees was gaining in popularity in Australia, for their role as pollinators as well as for their unique honey.

As well as having health benefits, stingless bee honey is valued for its flavour and is in high demand from chefs.

Credit: 
University of Queensland

Climate shift, forest loss and fires -- Scientists explain how Amazon forest is trapped in a vicious circle

image: Accumulated forest loss during 2001-2017 and fire regime change during the transition season (May-July)

Image: 
Xiyan Xu

Amazonia experiences seasonal fires each year. More than 90% of the fires occur along the southern boundary of the Amazon Basin, where savanna-dominated vegetation is flammable in the dry season. In recent decades, however, more fires have been reported in the Amazon forests, and the 2019 Amazon fires surged to a record high.

A new study, published in Global Change Biology, showed how the fire expansion is attributed to climate regime shift and forest loss. The study was led by scientists from the Institute of Atmospheric Physics (IAP) of the Chinese Academy of Sciences.

"I have not been to the Amazon forest but I took a photo of it from my window seat when I was in a plane flying over the Amazon in 2018 fire season. It pained me to think such intense greenness and freshness might have been scorched," said Prof. Gensuo Jia from IAP, one of the authors of the study.

Global climate change and local deforestation have been blamed as main drivers behind fire intensification. "However, mechanisms and interactive effects are largely ignored and not understood," said Jia.

According to the study, the Amazon fire regime has been expanding from the flammable savannas into moist tropical forests, and the fire season now starts much earlier than it did two decades ago.

"The fire expansion is a result of more extreme climate events which made the forest more vulnerable. Intensive forest loss which warmed and dried the lower atmosphere therefore increased fire susceptibility," said Xiyan Xu, the first author of the study, "fire exacerbates forest loss and results in a vicious cycle."

Fire data derived from satellite observations indicated more fires occurring along the "Arc of Deforestation", a curve along the southeastern edge of the forest where deforestation is most rapid. In the study, the researchers used multiple satellite data products and climate reanalyses to ensure consistency and reliability.

The Amazon forest is getting drier and more fire-susceptible due to coupled changes in climate seasonality, forest loss, and wildfire. Such positive feedback greatly undermines the sustainability of the Amazon region.

"Climate change mitigation and sustainable land management are key to avoid or at least postpone the 'tipping point' of the Amazon forest," said Xu. The "tipping point" is a threshold when the forest loss causes an abrupt or irreversible change in parts of the Earth system.

Credit: 
Institute of Atmospheric Physics, Chinese Academy of Sciences