
Some mobile phone apps may contain hidden behaviors that users never see

COLUMBUS, Ohio - A team of cybersecurity researchers has discovered that a large number of cell phone applications contain hardcoded secrets that could allow attackers to access private data or block content provided by users.

The study found that mobile phone apps may have hidden or harmful behaviors about which end users know little to nothing, said Zhiqiang Lin, an associate professor of computer science and engineering at The Ohio State University and senior author of the study.

The study has been accepted for publication by the 2020 IEEE Symposium on Security and Privacy in May. The conference has moved online because of the global coronavirus (COVID-19) outbreak.

Typically, mobile apps engage with users by processing and responding to user input, Lin said. For instance, users often need to type certain words or sentences, or click buttons and slide screens. Those inputs prompt an app to perform different actions.

For this study, the research team evaluated 150,000 apps. They selected the top 100,000 based on the number of downloads from the Google Play store, the top 20,000 from an alternative market, and 30,000 from pre-installed apps on Android smartphones.

They found that 12,706 of those apps, about 8.5 percent, contained something the research team labeled "backdoor secrets" - hidden behaviors within the app that accept certain types of content to trigger behaviors unknown to regular users. They also found that some apps have built-in "master passwords," which allow anyone with that password to access the app and any private data contained within it. And some apps, they found, had secret access keys that could trigger hidden options, including bypassing payment.

"Both users and developers are all at risk if a bad guy has obtained these 'backdoor secrets,'" Lin said. In fact, he said, motivated attackers could reverse engineer the mobile apps to discover them.

Qingchuan Zhao, a graduate research assistant at Ohio State and lead author of this study, said that developers often wrongly assume reverse engineering of their apps is not a legitimate threat.

"A key reason why mobile apps contain these 'backdoor secrets' is because developers misplaced the trust," Zhao said. To truly secure their apps, he said, developers need to perform security-relevant user-input validations and push their secrets on the backend servers.

The team also found another 4,028 apps - about 2.7 percent - that blocked content containing specific keywords related to censorship, cyberbullying or discrimination. That apps might limit certain types of content was not surprising - but the way they did it was: the validation was performed locally on the device rather than remotely on a server, Lin said.

"On many platforms, user-generated content may be moderated or filtered before it is published," he said, noting that several social media sites, including Facebook, Instagram and Tumblr, already limit the content users are permitted to publish on those platforms.

"Unfortunately, there might exist problems - for example, users know that certain words are forbidden from a platform's policy, but they are unaware of examples of words that are considered as banned words and could result in content being blocked without users' knowledge," he said. "Therefore, end users may wish to clarify vague platform content policies by seeing examples of banned words."

In addition, he said, researchers studying censorship may wish to understand what terms are considered sensitive.

The team developed an open source tool, named InputScope, to help developers understand weaknesses in their apps and to demonstrate that the reverse engineering process can be fully automated.

Credit: 
Ohio State University

Studies find link between belief in conspiracy theories and political engagement

Some political movements, particularly those extremist in nature, are associated with belief in conspiracy theories. Antisemitic demagogues, for example, have long referred to The Protocols of the Elders of Zion to support their cause, in effect using for their purposes a conspiracy theory that is still widely believed although it has long been known that the text itself is a literary forgery. However, the role that a belief in conspiracies actually plays in political extremism and the willingness to use physical force has to date been disputed by psychologists. Researchers at Johannes Gutenberg University Mainz (JGU) have now investigated the possible link on the basis of two studies undertaken in Germany and the USA. The study subjects were asked to assume that the world is controlled by powerful secret societies. Faced with the prospect that practically all areas of society are dominated by such conspiratorial groups, the subjects declared themselves less willing to become involved in lawful political activities. Instead, they would resort to illegal, violent means.

Contradictory data on the political outlook of adherents of conspiracy theories

Researchers at the JGU Institute of Psychology had noticed that the views expressed by the specialists in this field differ widely on the relation between conspiracy beliefs and political engagement. On the one hand, it is postulated that conspiracy-based views could have a motivating influence and that the corresponding adherents are more likely to become actively involved in politics in order to bring about change. On the other hand, however, others propose that a belief in conspiracies tends to lead to disaffection and even withdrawal from politics.

The Mainz-based team headed by Professor Roland Imhoff decided to investigate this contradiction and examined whether and in what form there is a connection between belief in conspiracies and active political engagement. To this end, 138 study participants in Germany and 255 in the USA were asked to imagine three scenarios: They live in a society that is secretly governed by powerful groups, they live in a society in which it is possible that certain conspiracies exist, or they live in a society in which there is no real reason to suspect underhand machinations. They were then required to indicate what sort of political stance they would take on the basis of 20 different suggestions. For example: "I would participate in an election by voting" or "I would try to influence the outcome of an election by hacking computers" or "I would carry out a violent attack on a person in a position of power".

The evaluation of the results showed how the apparent contradictions outlined above can be explained: There is a connection between the - in this case hypothetical - belief in conspiracy theories and the individual's political outlook, which when expressed in graph form produces an inverted U shape. This means that the willingness to engage in political activity reaches its peak among the mid-level adherents of conspiracy theories. Thereafter, the interest decreases again, especially when it comes to becoming actively engaged in legal means of political expression. Where there is an increasing conviction of being betrayed by the government, the tendency to resort to illegal, violent means increases. These tendencies were apparent in Germany as well as in the USA, although somewhat weaker in the US.

The results, as the authors write in their article in the journal Social Psychological and Personality Science, point to a real danger of conspiracy worldviews. "Once people are convinced of them, there is no need to pay allegiance to any form of social contract, as codified in laws and regulations or implicitly agreed on in forms of trust in epistemic authorities like quality media or university scientists." The social psychologists point out that there are clear limitations with regard to the two studies, most obviously with regard to the fact that the participants were asked to give hypothetical reactions to a hypothetical scenario. Thus, the conclusion that can be drawn is that belief in conspiracy theories may be associated with an attitude that assumes violent extremism to be an acceptable option.

Acceptance of an option will not necessarily result in concrete action

"We are by no means saying that belief in conspiracies leads to violent extremism," emphasized Professor Roland Imhoff. "Rather, what we are saying is that you might consider such an attitude acceptable even if as an outsider you put yourself in this world of thought." This is the first time that an experimental investigation has shown that political extremism and violence could be an almost logical could be an almost logical conclusion if one is convinced that secret conspiratorial powers control the world.

Credit: 
Johannes Gutenberg Universitaet Mainz

Caring for seniors during COVID-19 pandemic

image: Regenstrief Institute and Indiana University School of Medicine scientist Kathleen Unroe, M.D., MHA, and colleagues lay out guidelines and best practices for healthcare providers and family caregivers who are providing care for older adults during the COVID-19 pandemic. Their recommendations are published in the Journal of Geriatric Emergency Medicine.

Image: 
Regenstrief Institute

INDIANAPOLIS -- Older adults are at elevated risk for complications from COVID-19 and are dying at a higher rate than younger patients. In light of these concerns, Regenstrief Institute and Indiana University School of Medicine scientist Kathleen Unroe, M.D., MHA, and colleagues lay out guidelines and best practices for healthcare providers and family caregivers who are providing care for seniors during the COVID-19 pandemic. Their recommendations are published in the Journal of Geriatric Emergency Medicine.

"Our senior patients need additional measures of care and protection, and COVID-19 only exacerbates those needs," said Dr. Unroe, one of the authors on the paper. "Family care providers need to be aware of the hazards COVID-19 presents to their loved ones and understand how to mitigate them. I hope this information will provide helpful guidance to protect older adults during this crisis."

In the article, Dr. Unroe and her colleagues provide insight into several aspects of how the novel coronavirus is affecting the care of older adults.

Testing Seniors For COVID-19

Dr. Unroe and her colleagues highlight the need to prioritize testing for older adults even after increased screening capacity is available, due to their increased risk of complications from the disease. They advise health systems to make testing available in settings other than the emergency department whenever possible and to use options such as telecare in the screening process.

Caring for Older Adults

Older adults may be experiencing significant isolation already, and social distancing may worsen complications from seclusion, Dr. Unroe and her team note in the paper. Reduced time with caregivers may also place older adults at risk, due to missed opportunities to catch cognitive or general health decline and unrecognized falls.

If protective equipment is necessary for caregivers to wear around patients with cognitive decline, this change in appearance may be very disorienting for patients, and patients with dementia who are required to wear protective equipment may not understand why they are doing so. Dr. Unroe and her team recommend that clinicians provide additional resources to caregivers and steer them toward communities of support.

Symptoms

While fever and respiratory symptoms have been widely recognized as key symptoms associated with COVID-19, these symptoms often present differently in older adults. Fever, for example, may be blunted or absent entirely during infection for older adults. Respiratory symptoms may either be masked or exacerbated by co-occurring diseases, such as COPD, that can further worsen outcomes. Dr. Unroe and team point to the Infectious Diseases Society of America's modified definition of fever for older adults as a helpful alternative:

A single oral temperature over 100°F, or

Two repeated oral temperatures over 99°F, or

An increase in temperature of 2°F over the baseline temperature
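
As a rough illustration only (not medical guidance), the three criteria above can be encoded as a simple check. This sketch assumes oral temperatures in degrees Fahrenheit; the function name and input format are invented for the example.

def meets_modified_fever_criteria(single_temp_f=None,
                                  repeated_temps_f=None,
                                  current_temp_f=None,
                                  baseline_temp_f=None):
    """Illustrative check of the three criteria listed above.
    All temperatures are oral readings in degrees Fahrenheit."""
    if single_temp_f is not None and single_temp_f > 100.0:
        return True  # a single oral temperature over 100°F
    if repeated_temps_f and sum(t > 99.0 for t in repeated_temps_f) >= 2:
        return True  # two repeated oral temperatures over 99°F
    if (current_temp_f is not None and baseline_temp_f is not None
            and current_temp_f - baseline_temp_f >= 2.0):
        return True  # an increase of 2°F over the individual's baseline
    return False

print(meets_modified_fever_criteria(repeated_temps_f=[99.4, 99.6]))  # True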

The article also features key points, evidence and case studies surrounding other issues of elder care, including transitions in care between sites such as nursing homes and hospitals, guidance for triage, and suggestions for inpatient care systems changes.

"As our understanding of this virus continues to improve," said Dr. Unroe, "we must revise our practices of care, both clinically and residentially, to make sure that our most vulnerable populations are protected."

Credit: 
Regenstrief Institute

NASA, University of Nebraska release new global groundwater maps

image: Weekly maps of dry conditions in red and wet conditions in blue relative to the historic record are now available for the globe at three depths: surface soil moisture, root zone soil moisture, and groundwater, the latter shown in this global view.

Image: 
NASA / Scientific Visualization Studio

NASA researchers have developed new satellite-based, weekly global maps of soil moisture and groundwater wetness conditions, and one- to three-month U.S. forecasts of each product. While maps of current dry/wet conditions for the United States have been available since 2012, this is the first time they have been available globally.

"The global products are important because there are so few worldwide drought maps out there," said hydrologist and project lead Matt Rodell of NASA's Goddard Space Flight Center in Greenbelt, Maryland. "Droughts are usually well known when they happen in developed nations. But when there's a drought in central Africa, for example, it may not be noticed until it causes a humanitarian crisis. So it's valuable to have a product like this where people can say, wow, it's really dry there and no one's reporting it."

These maps are distributed online by the National Drought Mitigation Center at the University of Nebraska-Lincoln (UNL) to support U.S. and global drought monitoring.

"Being able to see a weekly snapshot of both soil moisture and groundwater is important to get a complete picture of drought," said professor Brian Wardlow, director for the Center for Advanced Land Management Information Technologies at UNL, who works closely with Rodell on developing remote sensing tools for operational drought monitoring.

Monitoring the wetness of the soil is essential for managing agricultural crops and predicting their yields, because soil moisture is the water available to plant roots. Groundwater is often the source of water for crop irrigation. It also sustains streams during dry periods and is a useful indicator of extended drought. But ground-based observations are too sparse to capture the full picture of wetness and dryness across the landscape like the combination of satellites and models can.

A Global Eye on Water

Both the global maps and the U.S. forecasts use data from the Gravity Recovery and Climate Experiment Follow-On (GRACE-FO) satellites, a pair of spacecraft operated by NASA and the German Research Centre for Geosciences that detect the movement of water on Earth based on variations in Earth's gravity field. GRACE-FO succeeds the highly successful GRACE satellites, which ended their mission in 2017 after 15 years of operation. With the global expansion of the product, and the addition of U.S. forecasts, the GRACE-FO data are filling in key gaps for understanding the full picture of wet and dry conditions that can lead to drought.

The satellite-based observations of changes in water distribution are integrated with other data within a computer model that simulates the water and energy cycles. The model then produces, among other outputs, time-varying maps of the distribution of water at three depths: surface soil moisture, root zone soil moisture (roughly the top three feet of soil), and shallow groundwater. The maps have a resolution of 1/8th degree of latitude, or about 8.5 miles, providing continuous data on moisture and groundwater conditions across the landscape.
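
The "wet or dry relative to the historic record" framing of these maps amounts to ranking the current week's value against the same grid cell's climatology. A minimal sketch of that idea follows, with invented array names and synthetic data; the actual NASA/UNL processing chain is far more involved.

import numpy as np

def wetness_percentile(current, historical):
    """Percentile of this week's value within the historical record at
    the same grid cell: 0 = record dry, 100 = record wet.
    `historical` has shape (n_weeks,); `current` is a scalar."""
    return 100.0 * np.mean(historical <= current)

# Hypothetical example: groundwater storage anomalies for one 1/8-degree cell.
rng = np.random.default_rng(0)
record = rng.normal(loc=0.0, scale=1.0, size=500)  # stand-in historic record
this_week = -1.5                                   # unusually dry value
print(f"wetness percentile: {wetness_percentile(this_week, record):.1f}")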

The GRACE and GRACE-FO satellite-based maps are among the essential data sets used by the authors of the U.S. Drought Monitor, the premier weekly map of drought conditions for the United States that is used by the U.S. Department of Agriculture and the Federal Emergency Management Agency, among others, to evaluate which areas may need financial assistance due to losses from drought.

"GRACE [provided and GRACE-FO now provides] a national scope of groundwater," said climatologist and Drought Monitor author Brian Fuchs, at the drought center. He and the other authors use multiple data sets to see where the evidence shows conditions have gotten drier or wetter. For groundwater, that used to mean going to individual states' groundwater well data to update the weekly map. "It's saved a lot of time having that groundwater layer along with the soil moisture layers, all in one spot," Fuchs said. "The high-resolution data that we're able to bring in allows us to draw those contours of dryness or wetness right to the data itself."

One of the goals of the new global maps is to make the same consistent product available in all parts of the world--especially in countries that do not have any groundwater-monitoring infrastructure.

"Drought is really a key [topic]... with a lot of the projections of climate and climate change," Wardlow said. "The emphasis is on getting more relevant, more accurate and more timely drought information, whether it be soil moisture, crop health, groundwater, streamflow--[the GRACE missions are] central to this," he said. "These types of tools are absolutely critical to helping us address and offset some of the impacts anticipated, whether it be from population growth, climate change or just increased water consumption in general."

Both the Center for Advanced Land Management and the National Drought Mitigation Center are based in UNL's School of Natural Resources, and they are working with international partners, including the U.S. Agency for International Development and the World Bank, to develop and support drought monitoring using the GRACE-FO global maps and other tools in the Middle East, North Africa, South Africa, South East Asia, and India.

Droughts can be complex, both in timing and extent. At the surface, soil moisture changes rapidly with weather conditions. The moisture in the root zone changes a little slower but is still very responsive to weather. Lagging behind both is groundwater, since it is insulated from changes in the weather. But for longer-term outlooks on drought severity--or, conversely, flood risk in low-lying areas--groundwater is the metric to watch, said Rodell.

"The groundwater maps are like a slowed down, smoothed version of what you see at the surface," Rodell said. "They represent the accumulation of months or years of weather events." That smoothing provides a more complete picture of the overall drying or wetting trend going on in an area. Having an accurate accounting of groundwater levels is essential for accurately forecasting near-future conditions.

The new forecast product that projects dry and wet conditions 30, 60, and 90 days out for the lower 48 United States uses GRACE-FO data to help set the current conditions. Then the model runs forward in time using the Goddard Earth Observing System, Version 5 seasonal weather forecast model as input. The researchers found that including the GRACE-FO data made the resulting soil moisture and groundwater forecasts more accurate.

Since the product has just been rolled out, the user community is only just beginning to work with the forecasts, but Wardlow sees a huge potential.

"I think you'll see the GRACE-FO monitoring products used in combination with the forecasts," Wardlow said. "For example, the current U.S. product may show moderate drought conditions, and if you look at the forecast and the forecast shows next month that there's a continued drying trend, then that may change the decision versus if it was a wet trend."

The U.S. forecast and global maps are freely available to users through the drought center's data portal.

Credit: 
NASA/Goddard Space Flight Center

Medical manufacturers with female directors act more quickly and frequently on recalls

BLOOMINGTON, Ind. -- Medical product companies, such as those that make pharmaceuticals and medical devices, make recall decisions quite differently as women are added to their board of directors, according to a new study by professors at four universities, including Indiana University.

In life-threatening situations when defective medical products may kill a consumer, companies with female directors issue recalls much more quickly. For less severe instances in which there is greater discretion in the recall decision, recalls occur more frequently for companies that have women on their boards.

Both of these findings point to a more socially conscious and responsive decision-making culture regarding product quality when women are on a company's board of directors.

The study, published in the journal Manufacturing & Service Operations Management, is believed to be the first examining how female board representation relates to operations management, specifically in the context of product recall decision-making.

"Our study shows that there is a difference in real and important safety outcomes for consumers, between firms that have women on their boards and those who do not," said George Ball, assistant professor of operations and decision technologies at the Indiana University Kelley School of Business.

Other authors on the study are Kaitlin Wowak, assistant professor in the Department of Information, Analytics and Operations at the University of Notre Dame's Mendoza College of Business; Corinne Post, professor of management at Lehigh University; and Dave Ketchen, the Harbert Eminent Scholar at Auburn University.

Firms with female directors announced recalls of products with the most serious, life-threatening defects 28 days earlier than at firms where the board was all-male. This equates to a 35 percent reduction in time between when such firms were first made aware of a defect and when the decision was made to recall the product.

The researchers also found that companies with female directors initiated recalls of product defects that are less severe and easier to hide from regulators 120 percent more often. These situations often involved packaging and labeling issues. This equates to about 12 more such recalls per firm.

A year ago, California became the first state to require that all public companies headquartered in the state have at least one female director. The European Commission has proposed requiring that large companies based in Europe have at least 40 percent female representation on their boards.

Previous research has suggested that women, compared to men, are more risk-averse, follow rules more closely and consider how their decisions influence a wider array of stakeholders.

While the addition of just one female director changed how recall decisions were made compared to firms with an all-male board, recall decisions continued to change as each additional woman was added to the board. Severe defects were recalled more quickly, and discretionary low-severity recalls occurred more often, with each additional female director.

The study relied on recall data obtained via a Freedom of Information Act request and recall timing data provided by the Food and Drug Administration. Researchers analyzed 4,271 medical product recalls from 2002 to 2013 across 92 publicly traded medical product firms regulated by the FDA.

Credit: 
Indiana University

New UC Davis research suggests parents should limit screen media for preschoolers

New research from University of California, Davis, suggests that parents should delay introducing their children to any screen media, as well as limit preschool-age children's use of mobile devices, including smartphones and tablets.

The research was published in the Journal of the American Medical Association Pediatrics this week. Over a two-and-a-half-year period, researchers assessed 56 children aged 32 to 47 months and surveyed their parents. The research team assessed children's self-regulation skills, or those skills needed to plan, control, and monitor their thoughts, feelings, and behaviors. Young children's self-regulation skills predict later academic success, social functioning, physical and mental health, income, and criminality.

Self-regulation skills were lower among children who began using any screen media devices (including television, computers, smartphones, and/or tablets) earlier in life, or who currently used mobile devices (smartphones and/or tablets) more often than others in the sample.

"Young children are often exposed to substantial amounts of screen media. Even though consumption of moderate amounts of high-quality children's media has been established to have a positive influence on development, the current findings support limiting children's use of mobile devices," said the study's primary author, Amanda C. Lawrence, a doctoral candidate in the Human Development Graduate Group at UC Davis. Co-authors are Daniel Ewon Choe, assistant professor of human development and family studies, and Madhuri S. Narayan, who was an undergraduate student when working on the research.

Devices also limit interaction time

Researchers voiced other reasons for cautious use of mobile devices by young children. "The portable nature of mobile devices allows them to be used in any location, such as while waiting for appointments, or in line at a grocery store. The screen use, then, could interfere with sensitive and responsive interactions with parents or practicing self-soothing behaviors that support optimal development," said Lawrence.

The research team recruited participants by handing out flyers at preschools and community events. Data were collected between July 1, 2016, and Jan. 11, 2019. During individual 90-minute visits to an on-campus research laboratory, children were asked to complete 10 tasks to evaluate their ability to self-regulate. Tasks were as varied as walking a line slowly, taking turns with the researcher in building a tower out of blocks, and delaying gratification -- for example, being asked to hold off unwrapping a gift while the researcher briefly left the room. Parents were asked about screen use using a novel survey designed by Lawrence, and researchers calculated the children's reported age at first use of screen media and average time spent per week on each device.

Other findings include:

There was substantial variation in the amount of time children spent with screen media devices in the average week in this community sample. Screen time for traditional devices (television, computers) ranged from 0 to 68 hours per week, and 0 to 14 hours per week for mobile devices (tablets, smartphones).

Children's screen time in the average week was not related to their family's income in this sample, but children growing up in higher-income households started using mobile devices at a younger age than children in lower-income households.

Screen time also did not differ by racial/ethnic minority status in this sample.

Additionally, children's exposure to what the researchers consider traditional screen devices (televisions, computers) in the average week was not related to their self-regulation, in contrast to most previous research. Lawrence speculates that messaging about providing child-directed, educational content and cautioning parents to monitor children's viewing has reached parents and has been effective, at least among some groups.

This is a small study, but it marks the beginning of a long-term longitudinal study of children's development of self-regulation that will examine all screen media devices over multiple years with more children and parents, researchers said.

Credit: 
University of California - Davis

Sediments may control location, magnitude of megaquakes

Boulder, Colo., USA: The world's most powerful earthquakes strike at subduction zones, areas where enormous amounts of stress build up as one tectonic plate dives beneath another. When suddenly released, this stress can cause devastating "megaquakes" like the 2011 Mw 9.0 Tohoku event, which killed nearly 16,000 people and crippled Japan's Fukushima Dai-ichi Nuclear Power Plant. Now a study published in Geology suggests that sediments atop the downgoing slab can play a key role in determining the magnitude and location of these catastrophic events.

In this newly published study, a team led by Gou Fujie, a senior scientist at the Japan Agency for Marine-Earth Science and Technology, used a trio of geophysical methods to image the subducting sediments in the northeastern Japan arc, where the Tohoku event occurred. The findings suggest that variations caused by volcanic rocks intruded into these sediments can substantially influence the nature of subduction zone earthquakes.

"Our imaging shows that the enormous amount of slip that occurred during the 2011 Tohoku earthquake stopped in an area of thin sediments that are just starting to subduct," says Fujie. "These results indicate that by disturbing local sediment layers, volcanic activity that occurred prior to subduction can affect the size and the distribution of interplate earthquakes after the layers have been subducted."

Researchers first began to suspect that variations in subducting sediments could influence megaquakes after the 2011 Tohoku event, when international drilling in the northeastern Japan arc showed that giant amounts of slip during the earthquake occurred in a slippery, clay-rich layer located within the subducting sediments. To better understand the nature of the downgoing slab in this region, Fujie's team combined several imaging techniques to paint a clearer picture of the subseafloor structure.

The researchers discovered there are what Fujie calls "remarkable regional variations" in the sediments atop the downgoing plate, even where the seafloor topography seems to be flat. There are places, he says, where the sediment layer appears to be extremely thin due to the presence of an ancient lava flow or other volcanic rocks. These volcanic intrusions have heavily disturbed, and in places thermally metamorphosed, the clay layer in which much of the seismic slip occurred.

Because the type of volcanism that caused sediment thinning in the northeastern Japan arc has also been found in many areas, says Fujie, the research suggests such thinning is ubiquitous--and that this type of volcanic activity has also affected other seismic events. "Regional variations in sediments atop descending oceanic plates appear to strongly influence devastating subduction zone earthquakes," he concludes.

Credit: 
Geological Society of America

Scientists tap unused energy source to power smart sensor networks

image: A team of scientists has developed a new mechanism to harvest stray magnetic fields all around us and convert the energy into useful, usable electricity.

Image: 
Kai Wang

The electricity that lights our homes and powers our appliances also creates small magnetic fields that are present all around us. Scientists have developed a new mechanism capable of harvesting this wasted magnetic field energy and converting it into enough electricity to power next-generation sensor networks for smart buildings and factories.

"Just like sunlight is a free source of energy we try to harvest, so are magnetic fields," said Shashank Priya, professor of materials science and engineering and associate vice president for research at Penn State. "We have this ubiquitous energy present in our homes, office spaces, work spaces and cars. It's everywhere, and we have an opportunity to harvest this background noise and convert it to useable electricity."

A team led by Penn State scientists developed a device that provides 400 percent higher power output compared to other state-of-the-art technology when working with low-level magnetic fields, like those found in our homes and buildings.

The technology has implications for the design of smart buildings, which will require self-powered wireless sensor networks to do things like monitor energy and operational patterns and remotely control systems, the scientists said.

"In buildings, it's known that if you automate a lot of functions, you could actually improve the energy efficiency very significantly," Priya said. "Buildings are one of the largest consumers of electricity in the United States. So even a few percent drop in energy consumption could represent or translate into megawatts of savings. Sensors are what will make it possible to automate these controls, and this technology is a realistic way to power those sensors."

Researchers designed paper-thin devices, about 1.5 inches long, that can be placed on or near appliances, lights, or power cords where the magnetic fields are strongest. These fields quickly dissipate away from the source of flowing electric current, the scientists said.

When placed 4 inches from a space heater, the device produced enough electricity to power 180 LED arrays, and at 8 inches, enough to power a digital alarm clock. The scientists reported the findings in the journal Energy and Environmental Science.

"These results provide significant advancements toward sustainable power for integrated sensors and wireless communication systems," said Min Gyu Kang, an assistant research professor at Penn State and co-lead author on the study.

The scientists used a composite structure, layering two different materials together. One of these materials is magnetostrictive, which converts a magnetic field into stress, and the other is piezoelectric, which converts stress, or vibrations, into an electric field. The combination allows the device to turn a magnetic field into an electric current.
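
In the magnetoelectric-composite literature, this two-step conversion is commonly summarized by a magnetoelectric voltage coefficient that chains the two material responses. As a hedged sketch of that standard relation (not a formula quoted from this paper):

\alpha_{\mathrm{ME}} = \frac{\partial E}{\partial H} = k_c \left(\frac{\partial S}{\partial H}\right)_{\text{magnetostrictive}} \left(\frac{\partial E}{\partial S}\right)_{\text{piezoelectric}}

where H is the applied magnetic field, S the mechanical strain transferred across the layer interface, E the induced electric field, and k_c an interface coupling factor between 0 and 1. Maximizing the transfer of strain between the two layers is what the architecture described here is designed to do.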

The device has a beam-like structure with one end clamped and the other free to vibrate in response to an applied magnetic field. A magnet mounted at the free end of the beam amplifies the movement and contributes toward a higher production of electricity, the scientists said.

"The beauty of this research is it uses known materials, but designs the architecture for basically maximizing the conversion of the magnetic field into electricity," Priya said. "This allows for achieving high power density under low amplitude magnetic fields."

Rammohan Sri Ramdas, an assistant research professor at Penn State, participated in the research.

Also contributing were Hyeon Lee and Prashant Kumar, research assistants at Virginia Tech, and Mohan Sanghadasa, senior research scientist at the Aviation and Missile Center, U.S. Army Combat Capabilities Development Command.

Some of the team members in this study were funded through the Office of Naval Research and the others through the National Science Foundation.

Credit: 
Penn State

Hubble finds best evidence for elusive mid-size black hole

image: This artist's impression depicts a star being torn apart by an intermediate-mass black hole (IMBH), surrounded by an accretion disc. This thin, rotating disc of material consists of the leftovers of a star which was ripped apart by the tidal forces of the black hole.

Image: 
ESA/Hubble, M. Kornmesser

New data from the NASA/ESA Hubble Space Telescope have provided the strongest evidence yet for mid-sized black holes in the Universe. Hubble confirms that this "intermediate-mass" black hole dwells inside a dense star cluster.

Intermediate-mass black holes (IMBHs) are a long-sought "missing link" in black hole evolution. There have been a few other IMBH candidates found to date. They are smaller than the supermassive black holes that lie at the cores of large galaxies, but larger than stellar-mass black holes formed by the collapse of massive stars. This new black hole is over 50 000 times the mass of our Sun.

IMBHs are hard to find. "Intermediate-mass black holes are very elusive objects, and so it is critical to carefully consider and rule out alternative explanations for each candidate. That is what Hubble has allowed us to do for our candidate," said Dacheng Lin of the University of New Hampshire, principal investigator of the study.

Lin and his team used Hubble to follow up on leads from NASA's Chandra X-ray Observatory and the European Space Agency's X-ray Multi-Mirror Mission (XMM-Newton), which carries three high-throughput X-ray telescopes and an optical monitor to make long uninterrupted exposures providing highly sensitive observations.

"Adding further X-ray observations allowed us to understand the total energy output," said team member Natalie Webb of the Université de Toulouse in France. "This helps us to understand the type of star that was disrupted by the black hole."

In 2006 these high-energy satellites detected a powerful flare of X-rays, but it was not clear whether it originated from inside or outside of our galaxy. Researchers attributed it to a star being torn apart after coming too close to a gravitationally powerful compact object, such as a black hole.

Surprisingly, the X-ray source, named 3XMM J215022.4-055108, was not located in the centre of a galaxy, where massive black holes normally reside. This raised hopes that an IMBH was the culprit, but first another possible source of the X-ray flare had to be ruled out: a neutron star in our own Milky Way galaxy, cooling off after being heated to a very high temperature. Neutron stars are the extremely dense remnants of an exploded star.

Hubble was pointed at the X-ray source to resolve its precise location. Deep, high-resolution imaging confirmed that the X-rays emanated not from an isolated source in our galaxy, but from a distant, dense star cluster on the outskirts of another galaxy -- just the sort of place astronomers expected to find evidence for an IMBH. Previous Hubble research has shown that the more massive the galaxy, the more massive its black hole. Therefore, this new result suggests that the star cluster that is home to 3XMM J215022.4-055108 may be the stripped-down core of a lower-mass dwarf galaxy that has been gravitationally and tidally disrupted by its close interactions with its current larger galaxy host.

IMBHs have been particularly difficult to find because they are smaller and less active than supermassive black holes; they do not have readily available sources of fuel, nor do they have a gravitational pull that is strong enough for them to be constantly drawing in stars and other cosmic material and producing the tell-tale X-ray glow. Astronomers therefore have to catch an IMBH red-handed in the relatively rare act of gobbling up a star. Lin and his colleagues combed through the XMM-Newton data archive, searching hundreds of thousands of sources to find strong evidence for this one IMBH candidate. Once found, the X-ray glow from the shredded star allowed astronomers to estimate the black hole's mass.

Confirming one IMBH opens the door to the possibility that many more lurk undetected in the dark, waiting to be given away by a star passing too close. Lin plans to continue this meticulous detective work, using the methods his team has shown to be successful.

"Studying the origin and evolution of the intermediate mass black holes will finally give an answer as to how the supermassive black holes that we find in the centres of massive galaxies came to exist," added Webb.

Black holes are one of the most extreme environments humans are aware of, and so they are a testing ground for the laws of physics and our understanding of how the Universe works. Does a supermassive black hole grow from an IMBH? How do IMBHs themselves form? Are dense star clusters their favoured home? With a confident conclusion to one mystery, Lin and other black hole astronomers find they have many more exciting questions to pursue.

Credit: 
ESA/Hubble Information Centre

Flooding stunted 2019 cropland growing season, resulting in more atmospheric CO2

image: Researchers are using satellite and aircraft observations to monitor regional land carbon fluxes in near real-time, as illustrated in this artist's concept. Satellite observations of solar-induced chlorophyll fluorescence (SIF) were used to track photosynthesis and estimate corresponding changes in land surface carbon fluxes. Meanwhile, atmospheric CO2 concentrations, which are influenced by the land surface carbon fluxes, can be observed by aircraft and from space. In this illustration, the two satellites depicted from left to right are: TROPOMI (TROPOspheric Monitoring Instrument) and OCO-2 (Orbiting Carbon Observatory-2). The aircraft is the ACT-America (Atmospheric Carbon and Transport - America).

Image: 
NASA/JPL-Caltech

Severe flooding throughout the Midwest--which triggered a delayed growing season for crops in the region--led to a reduction of 100 million metric tons of net carbon uptake during June and July of 2019, according to a new study.

For reference, the massive California wildfires of 2018 released an estimated 12.4 million metric tons of carbon into the atmosphere. And although part of this deficit due to floods was compensated for later in the growing season, the combined effects are likely to have resulted in a 15 percent reduction in crop productivity relative to 2018, the study authors say.

The study, published March 31, 2020, in the journal AGU Advances, describes how the carbon uptake was measured using satellite data. Researchers used a novel marker of photosynthesis known as solar-induced fluorescence to quantify the reduced carbon uptake due to the delay in the crops' growth. Independent observations of atmospheric CO2 levels were then employed to confirm the reduction in carbon uptake.

"We were able to show that it's possible to monitor the impacts of floods on crop growth on a daily basis in near real time from space, which is critical to future ecological forecasting and mitigation," says Yi Yin, research scientist at Caltech and lead author of the study.

Record rainfalls soaked the Midwest during the spring and early summer of 2019. For three consecutive months (April, May, and June), the National Oceanic and Atmospheric Administration reported that 12-month precipitation measurements had hit all-time highs. The resulting floods not only damaged homes and infrastructure but also impacted agricultural productivity, delaying the planting of crops in large parts of the Corn Belt, which stretches from Kansas and Nebraska in the west to Ohio in the east.

To assess the environmental impact of the delayed growing season, scientists at Caltech and JPL, which Caltech manages for NASA, turned to satellite data. As plants convert carbon dioxide (CO2) and sunlight into oxygen and energy through photosynthesis, a small amount of the sunlight they absorb is emitted back in the form of a very faint glow. The glow, known as solar-induced fluorescence, or SIF, is far too dim for us to see with the naked eye, but it can be measured through a process called satellite spectrophotometry.

The Caltech-JPL team quantified SIF using measurements from a European Space Agency (ESA) satellite-borne instrument to track the growth of crops with unprecedented detail. They found that the seasonal cycle of the 2019 crop growth was delayed by around two weeks and the maximum seasonal photosynthesis was reduced by about 15 percent. The stunted growing season was estimated to have led to a reduction in carbon uptake by plants of around 100 million metric tons from June to July 2019.

"SIF is the most accurate signal of photosynthesis by far that can be observed from space," says Christian Frankenberg, professor of environmental science and engineering at Caltech. "And since plants absorb carbon dioxide during photosynthesis, we wanted to see if SIF could track the reductions in crop carbon uptake during the 2019 floods."

To find out, the team analyzed atmospheric CO2 measurements from NASA's Orbiting Carbon Observatory-2 (OCO-2) satellite as well as from aircraft from NASA's Atmospheric Carbon and Transport America (ACT-America) project. "We found that the SIF-based estimates of reduced uptake are consistent with elevated atmospheric CO2 when the two quantities are connected by atmospheric transport models," says Brendan Byrne, co-corresponding author of the study and a NASA postdoctoral fellow at JPL.

"This study illuminates our ability to monitor the ecosystem and its impact on atmospheric CO2 in near real time from space. These new tools allow for global sensing of biospheric uptake of carbon dioxide," says Paul Wennberg, the R. Stanton Avery Professor of Atmospheric Chemistry and Environmental Science and Engineering, director of the Ronald and Maxine Linde Center for Global Environmental Science, and founding member of the Orbiting Carbon Observatory project.

Credit: 
California Institute of Technology

Fracking chemical may interfere with male sex hormone receptor

WASHINGTON--A chemical used in hydraulic fracturing, commonly called fracking, has the potential to interfere with reproductive hormones in men, according to research accepted for presentation at ENDO 2020, the Endocrine Society's annual meeting, and publication in a special supplemental section of the Journal of the Endocrine Society.

The study found the chemical can block the effects of testosterone and other male sex hormones known as androgens.

"Possible adverse health outcomes associated with anti-androgen exposure are abnormal reproductive function, male infertility and disrupted testicular and prostate development," said lead researcher Phum Tachachartvanich, Ph.D., of the University of California, Davis in Davis, Calif.

Hydraulic fracturing technology has significantly improved the yield of oil and natural gas extraction from unconventional sources. Fracking involves drilling and hydraulic extraction by injecting mixtures of industrial chemicals at high pressure into horizontal bore wells. Fracking chemicals contaminate the environment, including lakes, groundwater and wastewater, and they are likely to affect everyone who is exposed to this group of chemicals, according to Tachachartvanich.

"The widespread use of fracking has led to concerns of potential negative impacts on both the environment and human health," Tachachartvanich said. "Everyone should be concerned about fracking as the wastewater generated has potential endocrine-disrupting effects, which can adversely affect the general population."

The researchers used a computer model to rank 60 hydraulic fracturing chemicals used in California, based on the predicted potential of each chemical to interfere with androgens' ability to bind with cells in the body. Based on the rankings, they used a cell model to verify the top five fracking chemicals that showed the highest potential to interfere with this process.
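
In essence, this screening step is a ranking problem: score each chemical with a predictive model, sort, and carry the top candidates into the cell assay. A minimal sketch of that workflow in Python follows; the chemical names and scores are placeholders, not values from the study.

# Hypothetical scores from a computational model of androgen-receptor
# binding interference (higher = more likely to interfere). Placeholder data.
predicted_scores = {
    "chemical_A": 0.91,
    "chemical_B": 0.84,
    "chemical_C": 0.77,
    "chemical_D": 0.65,
    "chemical_E": 0.52,
    "chemical_F": 0.31,
}

# Rank all screened chemicals and keep the top five for in vitro testing.
ranked = sorted(predicted_scores.items(), key=lambda kv: kv[1], reverse=True)
top_five = [name for name, score in ranked[:5]]
print("advance to cell-model verification:", top_five)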

They then measured the androgen binding activity in the cell model for each chemical. Of the five hydraulic fracturing chemicals tested, only one, Genapol-X100, significantly inhibited androgen binding activity. "This suggests Genapol-X100 has endocrine-disrupting abilities," Tachachartvanich said.

Credit: 
The Endocrine Society

Most internists-in-training feel ill-equipped to treat obesity

WASHINGTON--Most resident physicians training in internal medicine do not feel adequately prepared to manage obesity in their patients, a new survey from a California residency program finds. The results were accepted for presentation at ENDO 2020, the Endocrine Society's annual meeting, and will be published in a special supplemental section of the Journal of the Endocrine Society.

"We are not training our next generation of doctors to feel comfortable with or knowledgeable about management of obesity, a disease rapidly increasing in prevalence that underlies many other medical conditions," said lead researcher Mita Shah Hoppenfeld, M.D. Hoppenfeld is a fourth-year internal medicine resident at Stanford University in Palo Alto, Calif., where she conducted the survey.

Hoppenfeld said their findings are concerning given that more than 42% of U.S. adults have obesity, according to recent statistics from the Centers for Disease Control and Prevention. Also of concern: Doctors completing an internal medicine residency will be on the front lines of treating patients with obesity and related complications in fields spanning primary care, endocrinology, cardiology and many others.

Although studies have found that practicing internists are ill-equipped to approach the topic of weight loss with patients, Hoppenfeld said research on residents' obesity management is scarce. To learn about residents' comfort, knowledge and practices managing obesity in their primary care clinics, she and her colleagues sent an electronic survey to all 125 Stanford internal medicine resident physicians at multiple clinical sites. Seventy residents, or 56%, responded.

The researchers found:

Although 81% of resident physicians described feeling comfortable or somewhat comfortable with counseling patients about lifestyle changes such as diet and exercise, only 33% reported consistently providing such counseling.

Barriers to providing lifestyle counseling included lack of time (93%), poor familiarity with resources (50%), and lack of training in motivational interviewing (36%). The top barrier (84%) to prescribing weight loss medications was unfamiliarity with them.

Nearly one-third (31%) of residents correctly identified medically advisable indications for bariatric (weight loss) surgery, but only 9% of those reported referring patients they considered appropriate for surgery.

When residents reported greater comfort with managing obesity, they were significantly more likely to take action.

Most residents wanted their training to include more information about weight management medications (90%) and referrals for obesity specialty care (77%).

"The lack of comfort with obesity management occurred at all levels of training," Hoppenfeld said. "Our findings suggest that increasing residents' education in obesity management may improve care for patients with obesity. We need to improve medical training to include specific, evidence-based teaching on management of obesity as a disease."

Credit: 
The Endocrine Society

Artificial intelligence could help predict future diabetes cases

WASHINGTON--A type of artificial intelligence called machine learning can help predict which patients will develop diabetes, according to an ENDO 2020 abstract that will be published in a special supplemental section of the Journal of the Endocrine Society.

Diabetes is linked to increased risks of severe health problems, including heart disease and cancer. Preventing diabetes is essential to reduce the risk of illness and death. "Currently we do not have sufficient methods for predicting which generally healthy individuals will develop diabetes," said lead author Akihiro Nomura, M.D., Ph.D., of the Kanazawa University Graduate School of Medical Sciences in Kanazawa, Japan.

The researchers investigated the use of a type of artificial intelligence called machine learning in diagnosing diabetes. Artificial intelligence (AI) is the development of computer systems able to perform tasks that normally require human intelligence. Machine learning is a type of AI that enables computers to learn without being explicitly programmed. With each exposure to new data, a machine-learning algorithm grows increasingly better at recognizing patterns over time.

"Using machine learning, it could be possible to precisely identify high-risk groups of future diabetes patients better than using existing risk scores," Nomura said. "In addition, the rate of visits to healthcare providers might be improved to prevent future onset of diabetes."

Nomura and colleagues analyzed 509,153 nationwide annual health checkup records from 139,225 participants from 2008 to 2018 in the city of Kanazawa. Among them, 65,505 participants without diabetes were included.

The data included physical exams, blood and urine tests and participant questionnaires. Patients without diabetes at the beginning of the study who underwent more than two annual health checkups during this period were included. New cases of diabetes were recorded during patients' checkups.

The researchers identified a total of 4,696 new diabetes patients (7.2%) in the study period. Their trained computer model predicted the future incidence of diabetes with an overall accuracy of 94.9%.
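
The abstract does not specify the algorithm used, so the following is only a generic sketch of the approach: train a classifier on tabular checkup features to predict later diabetes onset. The data here are synthetic and the feature layout is invented; scikit-learn's gradient boosting serves as a stand-in for whatever model the team trained.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder data standing in for checkup records: columns might be age,
# BMI, fasting glucose, blood pressure, etc. (invented for this sketch).
rng = np.random.default_rng(42)
X = rng.normal(size=(5000, 8))              # 5000 checkups, 8 features
y = (X[:, 2] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=5000)) > 1.5
# y = 1 if the participant later developed diabetes (synthetic label)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))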

Nomura says he next plans to perform clinical trials to assess the effectiveness of using statins to treat groups of patients identified by the machine learning model as being at high risk of developing diabetes.

Credit: 
The Endocrine Society

Discovery of new biomarker in blood could lead to early test for Alzheimer's disease

Researchers at the University of California San Diego discovered that high blood levels of RNA produced by the PHGDH gene could serve as a biomarker for early detection of Alzheimer's disease. The work could lead to the development of a blood test to identify individuals who will develop the disease years before they show symptoms.

The team published their findings in Current Biology.

The PHGDH gene produces RNA and proteins that are critical for brain development and function in infants, children and adolescents. As people get older, the gene typically ramps down its production of these RNAs and proteins. The new study, led by Sheng Zhong, a professor of bioengineering at the UC San Diego Jacobs School of Engineering in collaboration with Dr. Edward Koo, a professor of neuroscience at the UC San Diego School of Medicine, suggests that overproduction of a type of RNA, called extracellular RNA (exRNA), by the PHGDH gene in the elderly could provide an early warning sign of Alzheimer's disease.

"Several known changes associated with Alzheimer's disease usually show up around the time of clinical diagnosis, which is a little too late. We had a hunch that there is a molecular predictor that would show up years before, and that's what motivated this study," Zhong said.

The discovery was made possible thanks to a technique developed by Zhong and colleagues that is sensitive enough to sequence tens of thousands of exRNAs in less than one drop of blood. The method, dubbed SILVER-SEQ, was used to analyze the exRNA profiles in blood samples of 35 elderly individuals 70 years and older who were monitored up to 15 years prior to death. The subjects consisted of 15 patients with Alzheimer's disease; 11 "converters," which are subjects who were initially healthy then later developed Alzheimer's; and 9 healthy controls. Clinical diagnoses were confirmed by analysis of post-mortem brain tissue.

The results showed a steep increase in PHGDH exRNA production in all converters approximately two years before they were clinically diagnosed with Alzheimer's. PHGDH exRNA levels were on average higher in Alzheimer's patients. They did not exhibit an increasing trend in the controls, except in one control that was subsequently classified as a converter.

The researchers note some uncertainty regarding the anomalous converter. Since the subject died sometime during the 15-year monitoring period, it is unclear whether that individual would indeed have developed Alzheimer's had he or she lived longer, Zhong said.
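
One rough way to picture the converter analysis is as a per-subject trend test on longitudinal measurements: flag subjects whose PHGDH exRNA levels rise steeply late in follow-up. The sketch below uses invented numbers and a simple least-squares slope, not the study's actual statistics.

import numpy as np

def rising_trend(years, levels, slope_threshold=0.5):
    """Flag a steep increase via the least-squares slope of PHGDH exRNA
    level vs. time. The threshold here is arbitrary, for illustration."""
    slope, _ = np.polyfit(years, levels, 1)
    return slope > slope_threshold

# Hypothetical longitudinal samples from one subject (years before death).
years = np.array([-8.0, -6.0, -4.0, -2.0, -1.0])
levels = np.array([1.0, 1.1, 1.2, 2.4, 3.1])  # steep rise in the last ~2 years
print("converter-like trend:", rising_trend(years, levels))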

The team acknowledges additional limitations of the study.

"This is a retrospective study based on clinical follow-ups from the past, not a randomized clinical trial on a larger sample size. So we are not yet calling this a verified blood test for Alzheimer's disease," said co-first author Zixu Zhou, a bioengineering alumnus from Zhong's lab who is now at Genemo Inc., a startup founded by Zhong. "Nevertheless, our data, which were from clinically collected samples, strongly support the discovery of a biomarker for predicting the development of Alzheimer's disease."

In addition to randomized trials, future studies will include testing if the PHGDH biomarker can be used to identify patients who will respond to drugs for Alzheimer's disease.

The team is also open to collaborating with Alzheimer's research groups that might be interested in testing and validating this biomarker.

"If our results can be replicated by other centers and expanded to more cases, then it suggests that there are biomarkers outside of the brain that are altered before clinical disease onset and that these changes also predict the possible onset or development of Alzheimer's disease," Koo said. "If this PDGDH signal is shown to be accurate, it can be quite informative for diagnosis and even treatment response for Alzheimer's research."

Credit: 
University of California - San Diego

'Tequila' powered biofuels more efficient than corn or sugar

image: Blue agave (Agave tequilana)

Image: 
University of Sydney

The agave plant used to make tequila could be established in semi-arid Australia as an environmentally friendly solution to Australia's transport fuel shortage, a team of researchers at the University of Sydney, University of Exeter and University of Adelaide has found.

The efficient, low-water process could also help produce ethanol for hand sanitiser, which is in high demand during the COVID-19 pandemic.

In an article published this week in the Journal of Cleaner Production, University of Sydney agronomist Associate Professor Daniel Tan, with international and Australian colleagues, has analysed the potential to produce bioethanol (biofuel) from the agave plant, a high-sugar succulent widely grown in Mexico to make the alcoholic drink tequila.

The agave plant is now being grown as a biofuel source on the Atherton Tablelands in Far North Queensland by MSF Sugar, and it promises some significant advantages over existing sources of bioethanol such as sugarcane and corn, Associate Professor Tan said.

"Agave is an environmentally friendly crop that we can grow to produce ethanol-based fuels and healthcare products," said Associate Professor Tan from the Sydney Institute of Agriculture.

"It can grow in semi-arid areas without irrigation; and it does not compete with food crops or put demands on limited water and fertiliser supplies. Agave is heat and drought tolerant and can survive Australia's hot summers."

Associate Professor Tan assembled the research team and led its economic analysis.

Lead author Dr Xiaoyu Yan from the University of Exeter, who led the lifecycle assessment, said: "Our analysis highlights the possibilities for bioethanol production from agave grown in semi-arid Australia, causing minimum pressure on food production and water resources.

"The results suggest that bioethanol derived from agave is superior to that from corn and sugarcane in terms of water consumption and quality, greenhouse gas emissions, as well as ethanol output."

This study used chemical analyses of agave from a pilot agave farm in Kalamia Estate, Queensland (near Ayr) undertaken by Dr Kendall Corbin for her University of Adelaide PhD, supervised by Professor Rachel Burton.

"It is fabulous that the results of my chemical analysis can be used in both an economic and environmental footprint study and have real-world applications", Dr Corbin said.

"The economic analysis suggests that a first generation of bioethanol production from agave is currently not commercially viable without government support, given the recent collapse in the world oil price," Associate Professor Tan said. "However, this may change with the emerging demand for new ethanol-based healthcare products, such as hand sanitisers."

"This is the first comprehensive lifecycle assessment and economic analysis of bioethanol produced from a five-year agave field experiment in north Queensland. Our analysis shows a bioethanol yield of 7414 litres a hectare each year is achievable with five-year-old agave plants."

The study found that sugarcane yields 9900 litres a hectare each year. However, agave outperforms sugarcane on a range of measures, including freshwater eutrophication, marine ecotoxicity and - crucially - water consumption.

Agave uses 69 percent less water than sugarcane and 46 percent less water than corn for the same yield. For US corn ethanol, the yield was lower than agave, at 3800 litres a hectare a year.

"This shows agave is an economic and environmental winner for biofuel production in the years to come," Associate Professor Tan said.

Credit: 
University of Sydney