Tech

Grainger engineers voice localization techniques for smart speakers

image: An example of how an Amazon Alexa could determine a person's location within a home.

Image: 
Romit Roy Choudhury

Smart speakers - think an Amazon Alexa or a Google Home - offer a wide variety of capabilities to help free up both our time and our hands. We can hear the morning news while brushing our teeth, ask for a weather report while picking out a coat, and set a timer for the oven while handling two hot pans at once. According to Voicebot.ai, Alexa supports more than 100,000 skills worldwide, but one task it hasn't mastered is determining a user's location in the home.

This localization task was the focus of a recently published paper from a University of Illinois at Urbana-Champaign research team, "Voice Localization Using Nearby Wall Reflections," accepted to the 26th Annual International Conference on Mobile Computing and Networking. In the paper, the team - led by Coordinated Science Lab graduate student Sheng Shen - explores the development of VoLoc, a system that uses the microphone array on Alexa, as well as room echoes of the human voice, to infer the user's location inside the home.

Knowing a user's location within a home could help a smart device better support currently available skills. For instance, after receiving commands like "turn on the light" or "increase the temperature," Alexa currently has to guess which light and room is at the heart of the command. Using a technique known as reverse triangulation, Shen and advisor Romit Roy Choudhury are getting closer to voice localization.

"Applying this technique to smart speakers entails quite a few challenges," shared Shen, an electrical and computer engineering (ECE) student. "First, we must separate the direct human voice and each of the room echoes from the microphone recording. Then, we must accurately compute the direction for each of these echoes. Both challenges are difficult because the microphones simply record a mixture of all the sounds altogether."

VoLoc addresses these obstacles through an "align-and-cancel algorithm" that iteratively isolates the direction of each arriving voice signal and, from those directions, reverse triangulates the user's location. Some aspects of the room's geometry are learned spontaneously, which then helps with the triangulation. While this is an important breakthrough, Shen and Roy Choudhury plan to expand the research to more applications soon.
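The reverse-triangulation idea can be pictured with a little geometry: if the system knows the direction of the direct voice path, the direction of one wall echo, and where the reflecting wall is, the echo can be "unfolded" across the wall and the two rays intersected to estimate where the talker is standing. The sketch below is a simplified 2D illustration with made-up positions and angles, not the VoLoc align-and-cancel algorithm itself.

# Minimal 2D sketch of reverse triangulation from a direct voice path and one wall echo.
# All positions and angles below are hypothetical; this illustrates only the geometric
# idea, not the VoLoc align-and-cancel algorithm itself.
import numpy as np

def locate_speaker(mic, wall_x, theta_direct, theta_echo):
    """Estimate the talker's (x, y) position from two angles of arrival at the mic.

    mic          -- (x, y) position of the smart speaker's microphone array
    wall_x       -- x-coordinate of a vertical reflecting wall near the speaker
    theta_direct -- angle of arrival of the direct voice path, in radians
    theta_echo   -- angle of arrival of the wall echo, in radians
    """
    mic = np.asarray(mic, dtype=float)
    # The echo behaves as if it reached the mirror image of the mic across the wall.
    mic_mirror = np.array([2 * wall_x - mic[0], mic[1]])
    d_direct = np.array([np.cos(theta_direct), np.sin(theta_direct)])
    # Mirror the echo's arrival direction across the vertical wall (flip the x-component).
    d_echo = np.array([-np.cos(theta_echo), np.sin(theta_echo)])
    # Intersect the two rays: mic + t0*d_direct == mic_mirror + t1*d_echo.
    t = np.linalg.solve(np.column_stack([d_direct, -d_echo]), mic_mirror - mic)
    return mic + t[0] * d_direct

# Hypothetical example: mic at the origin, wall at x = 3 m, direct path at 45 degrees,
# echo arriving at about 26.6 degrees. The estimate comes out near (2, 2) metres.
print(locate_speaker(mic=(0.0, 0.0), wall_x=3.0,
                     theta_direct=np.deg2rad(45.0), theta_echo=np.deg2rad(26.57)))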

"Our immediate next step is to build to the smart speaker's frame of reference," Shen explained. "This could mean superimposing the locations, as provided by VoLoc, on a floorplan to determine that the user is in the laundry room. Alternatively, if the smart speaker picks up the sounds made by the washer and dryer in the same location as the voice command, it can come to the same conclusion."

The possibilities of this function are seemingly endless and could improve Alexa's current abilities.

"The implications are important," said Roy Choudhury, a CSL professor and the W.J. "Jerry" Sanders III - Advanced Micro Devices, Inc. Scholar in Electrical and Computer Engineering. "Location can help Alexa in improving speech recognition, since different speech vocabularies and models can be loaded. For example, a command like 'add urgent to the shopping list' may not make sense, but if Alexa knows that the user is in the laundry room, Alexa may be able to infer that the user actually said `add detergent to the shopping list'."

Shen and Roy Choudhury acknowledge that the technology could further erode privacy, by allowing companies like Amazon and Google to peer more closely into our homes and daily lives. However, they also believe the benefits are vital, as context-aware smart devices could become crucial supporting technologies to senior independent living and more.

For example, the technology could be used to remind a grandparent who lives independently to take their medication when he or she passes the medicine cabinet, or to remind a child to turn off the faucet when they run out of the bathroom with the water still running.

"It's more than interpreting voice commands," said Shen. "It provides an extra set of eyes when it comes to caring for loved ones as well."

Credit: 
University of Illinois Grainger College of Engineering

People want more compensation, security for their genomic data

ITHACA, N.Y. - Genomic data donated by the public is valuable for the companies that collect it. But a recent survey finds that once people are aware of issues surrounding the use and security of genetic information, they are more concerned about how it might be used and expect to be better compensated for providing it.

Research co-authored by Ifeoma Ajunwa, assistant professor of labor relations, law and history at Cornell University, details the results of the first nationally representative survey to consider DNA collection from both nonprofit and for-profit settings.

Ajunwa and Forrest Briscoe, professor of management and organization at Pennsylvania State University, co-authored "Evolving Public Views on the Value of One's DNA and Expectations for Genomic Database Governance: Results from a National Survey," published March 11 in PLOS One.

"A common theme in our findings is how people value control," Briscoe said. "They are more willing to provide data if they are told they will be able to select how it's used in the future - and if they can have it removed from the database later on. What they don't want is total loss of control once it's out there."

In their study, the authors provided 2,020 survey respondents with a three-minute video created from mainstream media coverage of genomic databases. Then they asked a series of questions about how governance policies - the ways data held by these companies is secured, used and regulated - would impact the respondents' willingness to provide data, as well as the payment they expect to receive.

After watching the video, nearly 12% of respondents said they would provide their data for free; more than 50% said they would provide it if compensated with payment; and nearly 39% said they wouldn't provide it even if payment were available. This contrasts with results reported by academic research biobanks, which consistently find higher rates of willingness to donate DNA.

"When people were more informed, they were a lot more interested in requiring greater security for their data, and they were a little bit more hesitant to give it up," said Ajunwa.

The survey also asked how 12 specific policies would affect respondents' willingness to provide genomic data. The three policies that made them most likely to provide it were: the ability to request their data be deleted; assurance that their data wouldn't be sold or shared; and requiring specific permissions to reuse the data.

They were least likely to want to provide their data if the company sold access to pharmaceutical firms, provided data to the federal government or retained the data indefinitely.

"People need to know the full worth of their genetic data in order to make an informed consent," Ajunwa said. "How much is the data worth, what kinds of safeguarding are necessary, is it OK to have something in digital form and therefore more vulnerable? There are all of these outstanding questions to be answered."

Credit: 
Cornell University

Bumblebees' aversion to pumpkin pollen may help plants thrive

image: A bumblebee on a leaf leaves a trail of cucurbit pollen behind her as she tries to remove it.

Image: 
Kristen Brochu, Cornell University

ITHACA, N.Y. - Cornell University researchers have found that squash and pumpkin pollen have physical, nutritional and chemical defense qualities that are harmful to bumblebees. The results of their recent study suggest that deterring bumblebees from collecting and eating pollen may provide an evolutionary benefit to cucurbit plants.

"When bumblebees are fed cucurbit pollen, it causes all kinds of problems," said Bryan Danforth, professor of entomology and the paper's senior author. "Adults have damaged and distorted digestive tracts and colonies fed cucurbit pollen failed to rear any offspring."

Bumblebees do visit pumpkin and squash flowers for the nectar, and though they don't collect the pollen, some of it might inadvertently get on their legs. These nectar-only visitors may still be good pollinators, as stray pollen on their bodies may end up pollinating the next flower.

"I actually saw them in the field using their legs to groom it off their bodies and then wipe it on a leaf," said first author Kristen Brochu, a former Cornell doctoral student in Danforth's lab and a postdoctoral researcher at Pennsylvania State University. "Not only are they not collecting it, they actually hate it."

In the study, Brochu and colleagues created diets that represented different defenses to test which cucurbit pollen characteristics deterred bumblebees. Microcolonies of five bees were each fed a separate treatment. The bees fed wildflower pollen thrived, as expected. Under a natural cucurbit diet, the cumulative effect of the pollen's physical defenses, poor nutritional content and chemicals led to bees ejecting their offspring from their brood cells and killing them. Bumblebees do this when stressed, possibly because they can't take care of the larvae, Brochu said.

In a third treatment, the team extracted the chemicals from the cucurbit pollen and added them to the control diet of nutritionally rich wildflower pollen. In a crushed-pollen treatment, where the pollen's physical defenses were removed, eggs and larvae failed to mature. "Over the course of the 50 days of the experiment, in both the crushed and natural cucurbit treatment, no offspring made it to adulthood," Brochu said.

In the crushed treatment, the adults also died at a higher rate, possibly due to a release of additional toxic chemicals.

For the sake of bumblebees, she said, pumpkin and squash growers may want to think twice about bringing commercial bumblebees into their fields and may provide wildflower strips as alternative food sources.

Credit: 
Cornell University

Rural Hondurans embrace cancer screening opportunities

image: Women in rural Honduras line up to get screened for cancer in a community-based screening event designed and conducted by Dartmouth and Honduran researchers.

Image: 
Kathleen Lyons, ScD

LEBANON, NH - The burden of cancer is on the rise in low- and middle-income countries (LMICs) such as Honduras. Few people in rural Honduras have access to cancer screening of any kind. A research team composed of Honduran oncologists and scientists from Dartmouth's and Dartmouth-Hitchcock's Norris Cotton Cancer Center wanted to test a new model of "multiphasic" cancer screening event, one offering testing for multiple types of cancer in a single screening experience. The team chose a rural area in mountainous Honduras and engaged local community leaders in identifying potential barriers to participation. Together, they developed simple strategies to mitigate those barriers and maximize participation.

"Each of three screening events attracted people from more than 30 different rural communities. Over the three events, 1,175 participants were screened and 190 received recommendations for follow-up testing within the urban cancer center. Of those 190, 88% of them adhered to the recommendation and sought follow-up care," reports Kathleen Lyons, ScD, Senior Scientist. "And even though there is a recognized 'men's health gap' worldwide, particularly in LMICs, the local community asked us to screen men while at the same time counseling us that the men might not attend. Actually, they did participate, and it was gratifying to see that almost all who were identified as at-risk complied with referral for clinical follow-up."

The international research team called CLARO, which stands for Community-Led Action Research in Oncology, is active in engaging community leaders to identify local Honduran priorities and challenges in oncology. Together, CLARO generates new knowledge to devise cancer prevention strategies that could be deployed in any rural setting around the world. "For example, we scheduled the screening events on weekend days, close to home, arranged transportation, fed participants a hot meal, and used local champions to issue personal invitations," says Lyons. "At each event, participants were screened for two to five different types of cancer to maximize what could be their only screening opportunity, ever."

CLARO's findings, "Feasibility of Brigade-Style, Multiphasic Cancer Screening in Rural Honduras," are newly published in ASCO's Journal of Clinical Oncology - Global Oncology under senior author, Honduran oncologist Suyapa Bejarano, MD.

From the beginning, CLARO designed a screening approach that can be carried out anywhere people can come together. Most of the onsite volunteers were Honduran medical students, and similar students are available in almost every country. "Involving them in cancer screening builds capacity long-term in their future medical practices, where they can utilize many of the techniques that they learn," notes Lyons. The spaces the team used are also like those found everywhere: "Small concrete schools with simple cubicles made by the men of the community. Headlamps eliminate the need for electricity, and our technique for tracking medical records onsite involves paper, pencil, and zip-top plastic bags."

The team is developing a "toolkit" that can guide other communities through the collaborative process of designing large-scale screening events customized to meet individual community needs. Dartmouth's long-term partners at the Ocean Road Cancer Institute in Dar es Salaam, Tanzania, are taking a different approach by screening at workplaces. "We look forward to collaborating with them to build our knowledge of what works across multiple geographic and cultural differences," says Lyons.

Curbing the rise of cancer in LMICs with prevention and early detection via community-based cancer screening is a strategy this work shows to be feasible to implement and welcomed by local people.

Credit: 
Dartmouth Health

Magic twist angles of graphene sheets identified

image: Soumendu Bagchi

Image: 
Department of Aerospace Engineering, Grainger Engineering

Graphene is 200 times stronger than steel and can be as much as 6 times lighter. These characteristics alone make it a popular material in manufacturing. Researchers at the University of Illinois at Urbana-Champaign recently uncovered more properties of graphene sheets that can benefit industry.

Doctoral student Soumendu Bagchi, along with his adviser Huck Beng Chew in the Department of Aerospace Engineering, in collaboration with Harley Johnson from Mechanical Science and Engineering, identified how twisted graphene sheets behave and how stable they are at different sizes and temperatures.

"We concentrated on two graphene sheets stacked on top of each other but with a twist angle," said Bagchi. "We did atomistic simulations at different temperatures for different sizes of graphene sheets. Using insights from these simulations, we developed an analytical model--you can plug in any sheet size, any twist angle, and the model will predict the number of local stable states it has as well as the critical temperature required to reach each of those states."

Bagchi explained that bilayer graphene exists in an untwisted Bernal-stacked configuration, which is also the repeated stacking sequence of crystalline hexagonal graphite. When bilayer graphene is twisted, it wants to untwist back to its original state because that is the most stable arrangement of the atoms.

"When the twisted atomic structure is heated, it tends to rotate back, but there are certain magic twist angles at which the structure remains stable below a specific temperature. And, there is a size dependency as well. What's exciting about our work is, depending upon the size of the graphene sheet, we can predict how many stable states you will have, the magic twist angles at these stable states, as well as the range of temperatures required for twisted graphene to transition from one stable state to another," Bagchi said.

According to Chew, manufacturers have been trying to make graphene transistors, and twisted graphene bilayers are known to exhibit exciting electronic properties. In manufacturing these graphene transistors, it's important to know what temperature will excite the material to achieve a certain rotation or mechanical response.

"They've known that a graphene sheet has certain electronic properties, and adding a second sheet at an angle yields new unique properties. But a single atomic sheet is not easy to manipulate. Fundamentally, this study answers questions about how twisted graphene sheets behave under thermal loading, and provides insights into the self-alignment mechanisms and forces at the atomic level. This could potentially pave the way for manufacturers to achieve fine control over the twist angle of 2D material structures. They can directly plug in parameters into the model to understand the necessary conditions required to achieve a specific twisted state."

Bagchi said no one has studied the properties of 2D materials in this way. It is a very fundamental study, and one that began as a different project, when he stumbled onto something unusual.

"He noticed that the graphene sheets showed some temperature dependence," Chew said. "We wondered why it behaved this way--not like a normal material.

"In normal materials, the interface is typically very strong. With graphene, the interface is very weak allowing the layers to slide and rotate. Observing this interesting temperature dependency wasn't planned. This is the beauty of discovery in science."

Credit: 
University of Illinois Grainger College of Engineering

Understanding how monomer sequence affects conductance in 'molecular wires'

image: Researchers in the Schroeder and Moore groups at the University of Illinois are interested in building and studying chain molecules with high levels of precision. Pictured from left, Hao Yu, graduate student in chemical and biomolecular engineering; Jeff Moore, professor of chemistry; Charles Schroeder, professor of chemical and biomolecular engineering; and Songsong Li, graduate student in materials science and engineering.

Image: 
Doris Dahl, Beckman Institute, University of Illinois at Urbana-Champaign

Researchers in the Schroeder and Moore groups at the University of Illinois at Urbana-Champaign have published a new study that illustrates how changes in the polymer sequence affect charge transport properties. This work required the ability to build and study chain molecules with high levels of precision.

The paper, "Charge Transport in Sequence-Defined Conjugated Oligomers," was published in the Journal of the American Chemical Society.

Chain molecules, or polymers, are ubiquitous in modern society, with organic electronic materials increasingly used in solar cells, flat panel displays, and sensors. However, conventional materials are generally made by statistical polymerization, where the order of the subunits, or monomers -- the monomer sequence -- is random.

"Traditional polymerization methods do not give us a perfect level of control of sequence," said Charles Schroeder, the associate head and Ray and Beverly Mentzer Professor in Chemical and Biomolecular Engineering and a full-time faculty member at the Beckman Institute for Advanced Science and Technology. "As a result, it has been challenging to ask how the monomer sequence affects its properties."

The researchers developed a method called iterative synthesis to deal with the problem. "Protein synthesis in our cells occurs by adding amino acids one by one. We use the same approach for making synthetic polymers, adding distinct monomers in a one-by-one fashion. This allows us to precisely control the sequence in a linear arrangement," said Hao Yu, a graduate student in the Schroeder Group and in the Moore Group, which is led by Jeff Moore, the Stanley O. Ikenberry Endowed Chair and professor of chemistry.

After making the materials, the researchers studied their charge transport properties using single molecule techniques. In this way, they were able to measure the conductance through single chains, much like a 'molecular wire.'

"Molecular wires are generally good at transporting charge," Schroeder said. "We wanted to know how the charge transport properties change if the overall sequence changes."

Yu added molecular anchors at both ends of the chain molecule to enable the characterization. "We used a technique called the scanning tunneling microscope-break junction method, where the anchors link to two gold electrodes and form a molecular junction," said Songsong Li, a graduate student in the Schroeder Group. "Then we impose an applied bias or voltage across the molecule, and this allows us to measure the charge transport properties of these polymers."
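In break-junction work generally, such measurements are analyzed by collecting many conductance-versus-displacement traces and pooling them into a one-dimensional histogram, in which the molecular junction shows up as a peak well below the conductance quantum G0. The sketch below illustrates that generic analysis on synthetic traces with an invented plateau value; it is not the groups' actual data pipeline.

# Generic sketch of single-molecule break-junction analysis using synthetic data.
# This is not the Schroeder/Moore groups' pipeline; the molecular conductance value
# and noise levels are invented for illustration.
import numpy as np

G0 = 7.748e-5           # conductance quantum, in siemens
G_MOL = 1e-4 * G0       # hypothetical molecular plateau conductance (1e-4 G0)
rng = np.random.default_rng(0)

def synthetic_trace(n=500):
    """One fake conductance trace: metallic contact, molecular plateau, tunneling decay."""
    metal = G0 * rng.lognormal(0.0, 0.05, 100)
    plateau = G_MOL * rng.lognormal(0.0, 0.2, 150)
    decay = G_MOL * np.exp(-np.linspace(0, 8, n - 250)) * rng.lognormal(0.0, 0.3, n - 250)
    return np.concatenate([metal, plateau, decay])

# Pool many traces into a log-binned 1D histogram; the molecular junction appears
# as a peak well below G0.
all_g = np.concatenate([synthetic_trace() for _ in range(1000)])
counts, edges = np.histogram(np.log10(all_g / G0), bins=200, range=(-6, 1))
below_0p1_g0 = np.searchsorted(edges, -1.0)          # only look below 0.1 G0
peak = counts[:below_0p1_g0].argmax()
print("most probable molecular conductance ~ 10^%.2f G0" % ((edges[peak] + edges[peak + 1]) / 2))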

"Currently the synthesis method is labor intensive," Schroeder said. "Moving forward, we are developing automated synthesis methods in the Beckman Institute to generate large libraries of sequence-defined molecules."

"The implications of this work are significant," said Dawanne Poree, program manager at the Army Research Office that supports the work. "It's often been wondered if the sequence-dependent properties observed in biological polymers could translate to synthetic polymeric materials. This work represents a step toward answering this question. Additionally, this work provides key insights into how molecular structure can be rationally designed and manipulated to render materials with designer properties of interest to the Army such as nanoelectronics, energy transport, molecular encoding, and data storage, self-healing, and more."

Credit: 
Beckman Institute for Advanced Science and Technology

Survey shows emergency physicians may benefit from training on safely handling firearms

image: Anonymous online survey of 1,074 employees of U.S. acute care services.

Image: 
Kirsty Challen, BSc, MBChB, MRes, PhD, Lancashire Teaching Hospitals, United Kingdom

DES PLAINES, IL -- Emergency physicians may benefit from training on safely handling firearms, according to the findings of a survey to be published in the March 2020 issue of Academic Emergency Medicine (AEM), a journal of the Society for Academic Emergency Medicine (SAEM).

The lead author of the study is Andrew Ketterer, MD, MA, assistant program director, Harvard Affiliated Emergency Medicine Residency, Beth Israel Deaconess Medical Center. The findings of the study are discussed in a recent AEM podcast, "Emergency Providers' Familiarity with Firearms: A National Survey," and further highlighted in commentary by AEM Editor-in-Chief Jeffrey A. Kline, MD.

Ketterer et al. found that the majority of survey respondents (more than one-half of the 1,074 surveyed) encountered firearms at work at least once per year, but under one-half of all respondents felt confident in their ability to deal with a firearm found on a patient. Firearm experience by the respondent corresponded with feelings of high self-confidence in this ability and was associated with comfort in managing firearms found in patients' possession. This finding suggests that emergency physicians may benefit from educational interventions targeting firearms safety.

Commenting on the study is Dr. Megan L. Ranney, an associate professor in the department of emergency medicine at Brown University and chief research officer for AFFIRM Research:

"This study reinforces what many of us know anecdotally: most emergency physicians have encountered a firearm at work in the last year, and lack of confidence in how to handle the firearm correlates with our feeling unsafe. As we begin to develop approaches to firearm injury prevention as a health problem, it's essential for us to include the voices of firearm owners and trainers, to improve our comfort with handling these challenging clinical situations."

The survey represents the largest study to date detailing the personal experience of emergency physicians with the safe handling of firearms.

Credit: 
Society for Academic Emergency Medicine

World's first experimental observation of a Kondo cloud

image: This is a schematic illustration of the Kondo cloud detection.

Image: 
Jeongmin Shim

Physicists have been trying to observe the quantum phenomenon known as the Kondo cloud for many decades. An international research team has now developed a novel device that successfully measures the length of the Kondo cloud and even allows the cloud to be controlled. The findings can be regarded as a milestone in condensed matter physics and may provide insights for understanding multiple-impurity systems, such as high-temperature superconductors.

This breakthrough was achieved by a team of researchers from City University of Hong Kong (CityU), the RIKEN Center for Emergent Matter Science (CEMS), the University of Tokyo, the Korea Advanced Institute of Science and Technology (KAIST), and Ruhr-University Bochum. Their findings were published in the journal Nature under the title “Observation of the Kondo Screening Cloud”.

What is the Kondo cloud?

The Kondo effect is a physical phenomenon discovered in the 1930s. In metals, electrical resistance usually drops as the temperature drops. However, if there are magnetic impurities in the metal, the opposite happens below a certain point: the resistance drops at first, but once the temperature falls below some threshold, it increases as the temperature decreases further.

This puzzle was solved over 50 years ago by Jun Kondo, a Japanese theoretical physicist, and the effect was named after him. He explained that when a magnetic atom (an impurity) is placed inside a metal, it has a spin. But instead of coupling with just one electron to form a pair of spin-up and spin-down, it couples collectively with all the electrons within some area around it, forming a cloud of electrons surrounding the impurity. This is called the Kondo cloud. So when a voltage is applied across the metal, the electrons are not free to move, or are screened off by the Kondo cloud, resulting in an increase in resistance.

How big is the cloud?

Some basic properties of the Kondo effect have been proved experimentally and were found to be related to the Kondo temperature (the threshold temperature below which the resistance starts to go up). However, a measurement of the Kondo cloud's length had yet to be achieved. Professor Heung-Sun Sim of KAIST, the theorist who proposed the method for detecting the Kondo cloud, commented that “the observed spin cloud is a micrometer-size object that has quantum mechanical wave nature and entanglement. This is why the spin cloud has not been observed despite a long search.”
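The micrometer scale quoted above is consistent with the standard theoretical estimate that the cloud extends over a length of roughly hbar * v_F / (k_B * T_K), where v_F is the Fermi velocity of the host electrons and T_K is the Kondo temperature. A back-of-the-envelope check with illustrative numbers (not the device parameters reported in the paper):

# Back-of-the-envelope Kondo cloud length, xi_K ~ hbar * v_F / (k_B * T_K).
# The Fermi velocity and Kondo temperature below are illustrative, not the device
# parameters reported in the Nature paper.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
K_B = 1.380649e-23       # Boltzmann constant, J/K

def kondo_cloud_length(v_fermi, t_kondo):
    """Estimated Kondo screening cloud length in metres."""
    return HBAR * v_fermi / (K_B * t_kondo)

# A GaAs-like semiconductor channel: v_F ~ 2e5 m/s and T_K ~ 1 K.
xi = kondo_cloud_length(v_fermi=2e5, t_kondo=1.0)
print(f"xi_K ~ {xi * 1e6:.1f} micrometres")   # about 1.5 micrometres, i.e. micrometre scale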

Dr Ivan Borzenets of City University of Hong Kong, who performed the experimental measurements for this project, elaborated further. “The difficulty in detecting the Kondo cloud lies in the fact that measuring spin correlation in the Kondo effect requires the fast detection of tens of gigahertz. And you cannot freeze time to observe and measure each of the individual electrons,” he said.

Isolating a single Kondo cloud in the device

Thanks to advances in nanotechnology, the research team fabricated a device that can confine an unpaired electron spin (magnetic impurity) in a quantum dot, like a small conducting island with a diameter of only a few hundred nanometres. "Since the quantum dot is very small, you can know exactly where the impurity is," said Dr Borzenets.

Connected to the quantum dot is a long, one-dimensional channel. The unpaired electron is constrained to couple to the electrons in this channel and form a Kondo cloud there. "In this way, we isolate a single Kondo cloud around a single impurity, and we can control the size of the cloud as well," he explained.

The novelty of the system is that, by applying a voltage at different points inside the channel at various distances from the quantum dot, the researchers induced "weak barriers" along the channel. They then observed the resulting change in electron flow and in the Kondo effect with varying barrier strength and position.

The secret lies in the oscillation amplitude

By changing the voltages, the researchers found that the conductance went up and down, no matter where they put the barriers. And whenever there were oscillations in conductance, oscillations in the measured Kondo temperature were observed.

When the researchers plotted the oscillation amplitude of the Kondo temperature against the barrier distance from the impurity divided by the theoretical cloud length, they found that all their data points fell onto a single curve, as theoretically expected. "We have experimentally confirmed the original theoretical result for the Kondo cloud length, which is on the micrometre scale," said Dr Borzenets. "For the first time, we have proved the existence of the cloud by directly measuring the Kondo cloud length. And we found out the proportionality factor connecting the size of the Kondo cloud and the Kondo temperature."

Providing insights into multiple-impurity systems

The team spent almost three years on this research. Their next step is to investigate different ways to control the Kondo state. According to Dr Michihisa Yamamoto of the RIKEN CEMS, who led the international collaboration, “it is very satisfying to have been able to obtain a real-space image of the Kondo cloud, as it is a real breakthrough for understanding various systems containing multiple magnetic impurities. This achievement was only made possible by close collaboration with theorists.”

“The size of the Kondo cloud in semiconductors was found to be much larger than the typical size of semiconductor devices. This means that the cloud can mediate interactions between distant spins confined in quantum dots, which is a necessary protocol for semiconductor spin-based quantum information processing. This spin-spin interaction mediated by the Kondo cloud is unique since both its strength and sign (two spins favor either parallel or anti-parallel configuration) are electrically tunable, while conventional schemes cannot reverse the sign. This opens up a novel way to engineer spin screening and entanglement,” Dr Yamamoto explained.

“It is remarkable in a fundamental and technical point of view that such a large quantum object can now be created, controlled, and detected,” Professor Heung-Sun Sim concluded.

Credit: 
City University of Hong Kong

Discovery of smallest known Mesozoic dinosaur reveals new species in bird evolution

image: A seemingly mature skull specimen preserved in Burmese amber reveals a new species, Oculudentavis khaungraae, that could represent the smallest known Mesozoic dinosaur in the fossil record.

Image: 
Xing Lida

CLAREMONT, CA (11 March 2020, 9 a.m. PDT)--The discovery of a small, bird-like skull, described in an article published in Nature, reveals a new species, Oculudentavis khaungraae, that could represent the smallest known Mesozoic dinosaur in the fossil record.

While working on fossils from northern Myanmar, Lars Schmitz, associate professor of biology at the W.M. Keck Science Department, and a team of international researchers discovered a seemingly mature skull specimen preserved in Burmese amber. The specimen's size is on par with that of the bee hummingbird, the smallest living bird.

"Amber preservation of vertebrates is rare, and this provides us a window into the world of dinosaurs at the lowest end of the body-size spectrum," Schmitz said. "Its unique anatomical features point to one of the smallest and most ancient birds ever found."

The team studied the specimen's distinct features with high-resolution synchrotron scans to determine how the skull of the Oculudentavis khaungraae differs from those of other bird-like dinosaur specimens of the era. They found that the shape and size of the eye bones suggested a diurnal lifestyle, but also revealed surprising similarities to the eyes of modern lizards. The skull also shows a unique pattern of fusion between different bone elements, as well as the presence of teeth. The researchers concluded that the specimen's tiny size and unusual form suggests a never-before-seen combination of features.

The discovery represents a specimen previously missing from the fossil record and has new implications for understanding the evolution of birds, demonstrating the extreme miniaturization of avian body sizes early in the evolutionary process. The specimen's preservation also highlights the potential of amber deposits to reveal the lowest limits of vertebrate body size.

"No other group of living birds features species with similarly small crania in adults," Schmitz said. "This discovery shows us that we have only a small glimpse of what tiny vertebrates looked like in the age of the dinosaurs."

Credit: 
Scripps College

ESO telescope observes exoplanet where it rains iron

image: This illustration shows a night-side view of the exoplanet WASP-76b. The ultra-hot giant exoplanet has a day side where temperatures climb above 2400 degrees Celsius, high enough to vaporise metals. Strong winds carry iron vapour to the cooler night side where it condenses into iron droplets. To the left of the image, we see the evening border of the exoplanet, where it transitions from day to night.

Image: 
ESO/M. Kornmesser

Researchers using ESO's Very Large Telescope (VLT) have observed an extreme planet where they suspect it rains iron. The ultra-hot giant exoplanet has a day side where temperatures climb above 2400 degrees Celsius, high enough to vaporise metals. Strong winds carry iron vapour to the cooler night side where it condenses into iron droplets.

"One could say that this planet gets rainy in the evening, except it rains iron," says David Ehrenreich, a professor at the University of Geneva in Switzerland. He led a study, published today in the journal Nature, of this exotic exoplanet. Known as WASP-76b, it is located some 640 light-years away in the constellation of Pisces.

This strange phenomenon happens because the 'iron rain' planet only ever shows one face, its day side, to its parent star, its cooler night side remaining in perpetual darkness. Like the Moon on its orbit around the Earth, WASP-76b is 'tidally locked': it takes as long to rotate around its axis as it does to go around the star.

On its day side, it receives thousands of times more radiation from its parent star than the Earth does from the Sun. It's so hot that molecules separate into atoms, and metals like iron evaporate into the atmosphere. The extreme temperature difference between the day and night sides results in vigorous winds that bring the iron vapour from the ultra-hot day side to the cooler night side, where temperatures decrease to around 1500 degrees Celsius.

Not only does WASP-76b have different day-night temperatures, it also has distinct day-night chemistry, according to the new study. Using the new ESPRESSO instrument on ESO's VLT in the Chilean Atacama Desert, the astronomers identified for the first time chemical variations on an ultra-hot gas giant planet. They detected a strong signature of iron vapour at the evening border that separates the planet's day side from its night side. "Surprisingly, however, we do not see the iron vapour in the morning," says Ehrenreich. The reason, he says, is that "it is raining iron on the night side of this extreme exoplanet."

"The observations show that iron vapour is abundant in the atmosphere of the hot day side of WASP-76b," adds María Rosa Zapatero Osorio, an astrophysicist at the Centre for Astrobiology in Madrid, Spain, and the chair of the ESPRESSO science team. "A fraction of this iron is injected into the night side owing to the planet's rotation and atmospheric winds. There, the iron encounters much cooler environments, condenses and rains down."

This result was obtained from the very first science observations done with ESPRESSO, in September 2018, by the scientific consortium who built the instrument: a team from Portugal, Italy, Switzerland, Spain and ESO.

ESPRESSO -- the Echelle SPectrograph for Rocky Exoplanets and Stable Spectroscopic Observations -- was originally designed to hunt for Earth-like planets around Sun-like stars. However, it has proven to be much more versatile. "We soon realised that the remarkable collecting power of the VLT and the extreme stability of ESPRESSO made it a prime machine to study exoplanet atmospheres," says Pedro Figueira, ESPRESSO instrument scientist at ESO in Chile.

"What we have now is a whole new way to trace the climate of the most extreme exoplanets," concludes Ehrenreich.

Credit: 
ESO

Engineers crack 58-year-old puzzle on way to quantum breakthrough

image: An artist's impression of how a nanometer-scale electrode is used to locally control the quantum state of a single nucleus inside a silicon chip.

Image: 
UNSW/Tony Melov

A happy accident in the laboratory has led to a breakthrough discovery that not only solved a problem that stood for more than half a century, but has major implications for the development of quantum computers and sensors.

In a study published today in Nature, a team of engineers at UNSW Sydney has done what a celebrated scientist first suggested in 1961 was possible, but has eluded everyone since: controlling the nucleus of a single atom using only electric fields.

"This discovery means that we now have a pathway to build quantum computers using single-atom spins without the need for any oscillating magnetic field for their operation," says UNSW's Scientia Professor of Quantum Engineering Andrea Morello. "Moreover, we can use these nuclei as exquisitely precise sensors of electric and magnetic fields, or to answer fundamental questions in quantum science."

That a nuclear spin can be controlled with electric fields, instead of magnetic fields, has far-reaching consequences. Generating magnetic fields requires large coils and high currents, while the laws of physics dictate that it is difficult to confine magnetic fields to very small spaces - they tend to have a wide area of influence. Electric fields, on the other hand, can be produced at the tip of a tiny electrode, and they fall off very sharply away from the tip. This will make control of individual atoms placed in nanoelectronic devices much easier.

A NEW PARADIGM

Prof Morello says the discovery shakes up the paradigm of nuclear magnetic resonance, a widely used technique in fields as disparate as medicine, chemistry and mining.

"Nuclear magnetic resonance is one of the most widespread techniques in modern physics, chemistry, and even medicine or mining," he says. "Doctors use it to see inside a patient's body in great detail, while mining companies use it to analyse rock samples. This all works extremely well, but for certain applications, the need to use magnetic fields to control and detect the nuclei can be a disadvantage."

Prof Morello uses the analogy of a billiard table to explain the difference between controlling nuclear spins with magnetic and electric fields.

"Performing magnetic resonance is like trying to move a particular ball on a billiard table by lifting and shaking the whole table," he says. "We'll move the intended ball, but we'll also move all the others."

"The breakthrough of electric resonance is like being handed an actual billiards stick to hit the ball exactly where you want it."

Amazingly, Prof Morello was completely unaware that his team had cracked a longstanding problem in finding a way to control nuclear spins with electric fields, first suggested in 1961 by a pioneer of magnetic resonance and Nobel Laureate, Nicolaas Bloembergen.

"I have worked on spin resonance for 20 years of my life, but honestly, I had never heard of this idea of nuclear electric resonance," Prof Morello says. "We 'rediscovered' this effect by complete accident - it would never have occurred to me to look for it. The whole field of nuclear electric resonance has been almost dormant for more than half a century, after the first attempts to demonstrate it proved too challenging."

OUT OF CURIOSITY

The researchers had originally set out to perform nuclear magnetic resonance on a single atom of antimony - an element that possesses a large nuclear spin. One of the lead authors of the work, Dr Serwan Asaad, explains: "Our original goal was to explore the boundary between the quantum world and the classical world, set by the chaotic behaviour of the nuclear spin. This was purely a curiosity-driven project, with no application in mind."

"However, once we started the experiment, we realised that something was wrong. The nucleus behaved very strangely, refusing to respond at certain frequencies, but showing a strong response at others," recalls Dr Vincent Mourik, also a lead author on the paper.

"This puzzled us for a while, until we had a 'eureka moment' and realised that we were doing electric resonance instead of magnetic resonance."

Dr Asaad continued: "What happened is that we fabricated a device containing an antimony atom and a special antenna, optimized to create a high-frequency magnetic field to control the nucleus of the atom. Our experiment demands this magnetic field to be quite strong, so we applied a lot of power to the antenna, and we blew it up!"

GAME ON

"Normally, with smaller nuclei like phosphorus, when you blow up the antenna it's 'game over' and you have to throw away the device," says Dr Mourik.
"But with the antimony nucleus, the experiment continued to work. It turns out that after the damage, the antenna was creating a strong electric field instead of a magnetic field. So we 'rediscovered' nuclear electric resonance."

After demonstrating the ability to control the nucleus with electric fields, the researchers used sophisticated computer modelling to understand how exactly the electric field influences the spin of the nucleus. This effort highlighted that nuclear electric resonance is a truly local, microscopic phenomenon: the electric field distorts the atomic bonds around the nucleus, causing it to reorient itself.

"This landmark result will open up a treasure trove of discoveries and applications," says Prof Morello. "The system we created has enough complexity to study how the classical world we experience every day emerges from the quantum realm. Moreover, we can use its quantum complexity to build sensors of electromagnetic fields with vastly improved sensitivity. And all this, in a simple electronic device made in silicon, controlled with small voltages applied to a metal electrode!"

Credit: 
University of New South Wales

Next gen 911 services are highly vulnerable to cyberattacks -- Ben-Gurion researchers

NORTH CAROLINA...March 11, 2020 - Despite a previous warning by Ben-Gurion University of the Negev (BGU) researchers, who exposed the vulnerability of the 911 system to distributed denial-of-service (DDoS) attacks, the next generation of 911 systems, which now accommodate text, images and video, still has the same or more severe issues.

In the study, published in IEEE Transactions on Dependable and Secure Computing, the BGU researchers evaluated the impact of DDoS attacks on the current (E911) and next-generation 911 (NG911) infrastructures in North Carolina. The research was conducted by Dr. Mordechai Guri, head of research and development at the BGU Cyber Security Research Center (CSRC) and chief scientist at Morphisec Technologies, and Dr. Yisroel Mirsky, senior cyber security researcher and project manager at the BGU CSRC.

In recent years, organizations have experienced countless DDoS attacks, during which internet-connected devices are flooded with traffic - often generated by many computers or phones called "bots" that have been infected with malware by a hacker and act in concert with each other. When an attacker ties up all the available connections with malicious traffic, no legitimate information - like a call to 911 in a real emergency - can make it through.

"In this study, we found that only 6,000 bots are sufficient to significantly compromise the availability of a state's 911 services and only 200,000 bots can jeopardize the entire United States," Dr. Guri explains.

When telephone customers dial 911 on their landlines or mobile phones, the telephone companies' systems make the connection to the appropriate call center. Because of the limitations of the original E911, the U.S. has been slowly transitioning the older circuit-switched 911 infrastructure to a packet-switched voice over IP (VoIP) infrastructure, NG911. NG911 improves reliability by enabling load balancing between emergency call centers, or public safety answering points (PSAPs). It also expands 911 service capabilities, enabling the public to call over VoIP and to transmit text, images, video, and data to PSAPs. A number of states have implemented NG911, and nearly all other states have begun planning or have some localized implementation.

Many internet companies have taken significant steps to safeguard against this sort of online attack. For example, Google Shield is a service that protects news sites from attacks by using Google's massive network of internet servers to filter out attacking traffic, while allowing through only legitimate connections. However, phone companies have not done the same.

To demonstrate how DDoS attacks could affect 911 call systems, the researchers created a detailed simulation of North Carolina's 911 infrastructure and a general simulation of the entire U.S. emergency-call system. They showed that only 6,000 infected phones are enough to effectively block 911 calls from 20% of the state's landline callers and half of its mobile customers. "In our simulation, even people who called back four or five times would not be able to reach a 911 operator to get help," Dr. Guri says.
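The crowding-out effect can be pictured with a toy trunk-occupancy simulation: bots redial constantly and briefly hold lines, so real callers usually find every line busy. The numbers below (trunk count, call rates, hold times) are invented for illustration, and this is not the researchers' North Carolina model.

# Toy trunk-occupancy model of bots crowding out legitimate 911 callers.
# Every number here (trunk count, rates, hold times) is invented for illustration;
# this is not the BGU team's North Carolina simulation.
import random

random.seed(1)

TRUNKS = 240        # hypothetical number of simultaneous 911 lines
BOT_RATE = 50.0     # aggregate bot call attempts per second (hypothetical)
LEGIT_RATE = 0.5    # real emergency calls per second (hypothetical)
BOT_HOLD = 10.0     # seconds a bot call ties up a trunk before being dropped
LEGIT_HOLD = 90.0   # seconds a real emergency call occupies a trunk

def blocked_fraction(sim_seconds=600.0):
    """Fraction of real callers who find every trunk busy on their first attempt."""
    free_at = [0.0] * TRUNKS                 # time at which each trunk next becomes free
    t, legit_total, legit_blocked = 0.0, 0, 0
    while t < sim_seconds:
        t += random.expovariate(BOT_RATE + LEGIT_RATE)
        is_bot = random.random() < BOT_RATE / (BOT_RATE + LEGIT_RATE)
        idx = min(range(TRUNKS), key=lambda i: free_at[i])   # trunk that frees up soonest
        if free_at[idx] <= t:                                # a line is free: call connects
            free_at[idx] = t + (BOT_HOLD if is_bot else LEGIT_HOLD)
            if not is_bot:
                legit_total += 1
        elif not is_bot:                                     # every line busy: caller blocked
            legit_total += 1
            legit_blocked += 1
    return legit_blocked / max(1, legit_total)

print(f"real 911 callers blocked in the toy model: {blocked_fraction():.0%}")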

The countermeasures that exist today are difficult and not without flaws. Many involve blocking certain devices from calling 911, which carries the risk of preventing a legitimate call for help. But they indicate areas where further inquiry - and collaboration between researchers, telecommunications companies, regulators, and emergency personnel - could yield useful breakthroughs.

For example, cellphones might be required to run monitoring software to blacklist or block themselves from making fraudulent 911 calls. Or 911 systems could examine identifying information of incoming calls and prioritize those made from phones that are not trying to mask themselves.

"Many say that the new NG911 solves the DDoS problem because callers can be connected to PSAPs around the country, not just locally," Dr. Mirsky explains. "Nationally, with complete resource sharing, the rate that callers give up trying -- called the 'despair rate' -- is still significant: 15% with 6,000 bots and 43% with 50,000 bots.

"But the system would still need to communicate locally to dispatch police, medical and fire services. As a result, the despair rate is more likely to be 56% with 6,000 bots --worse than using the original E911 infrastructure."

According to Dr. Guri, "We believe that this research will assist the respective organizations, lawmakers and security professionals in understanding the scope of this issue and aid in the prevention of possible future attacks on the 911 emergency services. It is critical that 911 services always be available - to respond quickly to emergencies and give the public peace of mind."

Credit: 
American Associates, Ben-Gurion University of the Negev

NASA-NOAA satellite catches development of tropical storm 21S

image: NASA-NOAA's Suomi NPP satellite passed over the Southern Indian Ocean and captured a visible image of newly formed Tropical Storm 21S off the coast of Western Australia.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA-NOAA's Suomi NPP satellite passed over the Southern Indian Ocean and provided forecasters with a visible image of newly formed Tropical Storm 21S.

The Visible Infrared Imaging Radiometer Suite (VIIRS) instrument aboard Suomi NPP provided a visible image of Tropical Cyclone 21S that revealed strong thunderstorms circling the center of circulation. Strong storms were also in the southeastern quadrant of the storm which extended to the coast of Western Australia.

At 0900 UTC (5 a.m. EDT), Tropical Cyclone 21S was located near latitude 16.4 degrees south and longitude 116.9 degrees east, about 253 nautical miles north-northwest of Port Hedland, Australia. 21S was moving to the south and had maximum sustained winds near 35 knots (40 mph/65 kph). The storm is not expected to strengthen much before it makes landfall in Western Australia.

The Joint Typhoon Warning Center (JTWC) expects 21S to move south-southeast and pass over Exmouth as a tropical storm. Exmouth is a town located on the tip of the North West Cape in Western Australia. JTWC forecasters expect that 21S will dissipate after two days.

Tropical cyclones/hurricanes are the most powerful weather events on Earth. NASA's expertise in space and scientific exploration contributes to essential services provided to the American people by other federal agencies, such as hurricane weather forecasting.

Credit: 
NASA/Goddard Space Flight Center

Natural bayou better when floods threaten Houston

image: Buffalo Bayou in Houston. A Rice University comparison of flood plains around Houston's two major bayous shows the natural Buffalo Bayou is far better at managing floodwaters than the channelized Brays Bayou.

Image: 
Andrew Juan/Rice University

HOUSTON - (March 11, 2020) - One bayou meanders toward downtown Houston. The other runs in parallel to the south, much of it through a concrete channel.

Which is better at preventing floods? Researchers at Rice University's Brown School of Engineering give the nod to nature.

In studying the evolution of flood plains based on Houston's Buffalo and Brays bayous, the researchers associated with Rice's Severe Storm Prediction, Education & Evacuation from Disasters Center determined Buffalo's largely natural form has proven better at absorbing floodwater and preventing it from spilling over into heavily populated areas.

An open-access paper in the Journal of Flood Risk Management details Rice models that show how flood plains have evolved and will evolve over a 70-year span, up to 2040.

The team of Rice research scientist Andrew Juan, former graduate student Avantika Gori and associate research scientist Antonia Sebastian found that while the extent of the 10-year flood plain has remained relatively stable along both bayous, the 100-year flood plain (areas with a 1-in-100 chance of flooding during any given year) shows stark differences.
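For context on what those return periods imply over time, a standard back-of-the-envelope calculation (not a figure from the study): a 1% annual chance compounds to roughly a one-in-four chance of at least one flood over a 30-year mortgage.

# Chance of at least one flood over a planning horizon, given an annual exceedance
# probability. A standard textbook calculation, not a result from the Rice study.
def chance_of_at_least_one_flood(annual_probability, years):
    return 1.0 - (1.0 - annual_probability) ** years

print(f"100-year flood plain, 30-year horizon: {chance_of_at_least_one_flood(0.01, 30):.0%}")  # ~26%
print(f"10-year flood plain, 30-year horizon:  {chance_of_at_least_one_flood(0.10, 30):.0%}")  # ~96%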

While the 100-year flood plain along Buffalo Bayou remained mostly unchanged in the model over the 70-year span, the researchers show that recent and potential development along the already densely populated Brays has a significant impact on flooding.

Their models show that urban development in the Brays' watershed, which they expect to grow by 29% from the 1970s through 2040, increased the 100-year flood plain from 10.5 square kilometers in the 1970s to 46.1 square kilometers in 2011. They predict it will rise to 73.3 square kilometers by 2040.

"This is one of the major observations," Juan said. "Hydrologists like to talk about streamflow, runoff volume and peak timing, but from a risk standpoint, flood plain extent is one of the performance metrics we should really pay attention to. In this case, we are seeing a huge difference in our model between the two neighboring watersheds."

To some degree, he said, Buffalo Bayou's relative superiority is due to forward-thinking individuals who, starting in the 1960s, protested "channelizing" the bayou as had been done at Brays. That has helped preserve the bayou's natural drainage, which can contain normally heavy rains because runoff reaches the bayou more slowly. Furthermore, banning buildings from the designated setbacks close to the bayou has kept property damage at bay, Juan said.

"It's not just about natural drainage," he said. "It's also about keeping the setbacks from being developed. Removing people from flood-vulnerable areas is effective."

The flood plains near Brays are of an entirely different character and have changed more significantly since 1970 than those along the Buffalo, Juan said.

"When you look at the number of residential parcels within the flood plains, the two are comparable in the 1970s," he said. "However, you see a dramatic increase in 2011 for Brays, but not so much for Buffalo."

The differences are evident at upstream and downstream watch points along Brays, to the west and east of the flood-prone Meyerland neighborhood, according to the study. Measurements at those points show both 10- and 100-year normalized peak flows nearly doubled between the 1970s and 2011, with larger peak flows occurring when heavy rains inundate the neighborhood.

The study only looked at rainwater runoff into the bayous and its impact on flooding. It did not consider minor contributions from underground pipelines, which are sufficient to handle 2- to 5-year floods, or the impact of the Addicks and Barker reservoirs upstream of Buffalo Bayou, whose gates were opened in the aftermath of Hurricane Harvey.

Juan said the researchers realize moving structures away from Brays is highly unlikely, but reconfiguring available open spaces and golf courses to serve as retention ponds when needed could help areas at the epicenter of flooding over the past decade, particularly Meyerland.

"The heavily invested Project Brays includes a number of channelization, detention and bridge modification projects," he said. "It's a multimillion-dollar project and it definitely has benefited the watershed as a whole. But on the flip side, it might have contributed to creating a false sense of security. It was supposed to protect the entire watershed from a 100-year flood, but flood damages from recent major storms have shown that it was insufficient.

"Maybe individual homeowners, residents, planners and other stakeholders need to be more proactive in exploring other mitigation strategies," he said.

Sebastian is now an assistant professor of geological sciences at the University of North Carolina at Chapel Hill. The study was supported by the Houston Endowment and the National Science Foundation.

Credit: 
Rice University

Study unveils striking disparities in health outcomes among 2 populations

Boston - In the United States, income inequality has steadily increased over the last several decades. Given widening inequities, health care leaders have been concerned about the health outcomes of older Americans who experience poverty, particularly because prior studies have shown a strong link between socioeconomic status and health.

In a new study published today in JAMA, a team of researchers led by Rishi Wadhera, MD, MPP, MPhil, an investigator in the Smith Center for Outcomes Research in Cardiology at Beth Israel Deaconess Medical Center (BIDMC), evaluated how health outcomes for low-income older adults who are dually enrolled in both Medicare and Medicaid have changed since the early 2000s, and whether disparities have narrowed or widened over time compared with more affluent older adults who are solely enrolled in Medicare.

The study included more than 71 million older adults insured by Medicare from 2004 to 2017. The researchers evaluated the change in annual death rates, hospitalization rates, and hospitalization-related deaths for the subset of this population that was also dually enrolled in Medicaid due to poverty. Wadhera and colleagues also assessed whether health outcomes for this low-income population have improved or worsened compared with more affluent older adults enrolled only in Medicare.

"We found that annual death rates were more than two-fold higher for low-income older adults who were dually enrolled in Medicare and Medicaid compared with their more affluent, non-dually enrolled counterparts," said Wadhera. "In addition, annual hospitalization rates among dually-enrolled adults were almost double that of non-dually enrolled adults, and 30-day and one-year death rates after hospitalization were also higher for this low-income population. Most concerning, disparities in health outcomes between these two groups have not narrowed, and in some cases, are worsening."

Wadhera and colleagues also found that annual death rates for low-income older adults were generally highest in Midwestern states, such as Indiana and Ohio, and in the Southeastern regions of the U.S., including Florida, Alabama, Arkansas and the Carolinas. Further research is needed to understand why high death rates persist in these regions of the U.S.

The findings are particularly striking given that over the last decade, a large part of U.S. policy and public health focus has been on improving the health of vulnerable and marginalized populations. Wadhera and colleagues say the study suggests that greater local and national efforts are needed to reduce health inequities.

"We know that low-income older Americans who are dually-enrolled in Medicare and Medicaid face unique challenges, such as poverty, housing instability, residence in more disadvantaged neighborhoods, and worse access to health care," said Wadhera. "Our findings suggest that in order to improve health equity for this population, public health and policy efforts are needed to directly address social determinants of health and to provide support for safety-net health care systems that tend to care for low-income patients."

Credit: 
Beth Israel Deaconess Medical Center