Tech

Trust me, I'm a chatbot

image: In this research study, the test subjects chatted with a chatbot - but only half of them knew that it was a non-human conversation partner.

Image: 
Mozafari

More and more companies are using chatbots in customer service. Due to advances in artificial intelligence and natural language processing, chatbots are often indistinguishable from humans when it comes to communication. But should companies let their customers know that they are communicating with machines and not with humans? Researchers at the University of Göttingen investigated this question. They found that consumers tend to react negatively when they learn that the person they are talking to is, in fact, a chatbot. However, if the chatbot makes mistakes and cannot solve a customer's problem, the disclosure triggers a positive reaction. The results of the study were published in the Journal of Service Management.

Previous studies have shown that consumers react negatively when they learn that they are communicating with chatbots - it seems that consumers are inherently averse to the technology. In two experimental studies, the Göttingen University team investigated whether this is always the case. Each study had 200 participants, who were put into a scenario where they had to contact their energy provider via online chat to update the address on their electricity contract following a move. In the chat, they encountered a chatbot - but only half of them were informed that they were chatting online with a non-human contact. The first study investigated the impact of making this disclosure depending on how important the customer perceived the resolution of their service query to be. The second study investigated the impact of making this disclosure depending on whether the chatbot was able to resolve the customer's query. To investigate the effects, the team used statistical methods such as analysis of covariance and mediation analysis.

The result: most noticeably, if service issues are perceived as particularly important or critical, there is a negative reaction when it is revealed that the conversation partner is a chatbot. This scenario weakens customer trust. Interestingly, however, the results also show that disclosing that the contact was a chatbot leads to positive customer reactions in cases where the chatbot cannot resolve the customer's issue. "If their issue isn't resolved, disclosing that they were talking with a chatbot makes it easier for the consumer to understand the root cause of the error," says first author Nika Mozafari from the University of Göttingen. "A chatbot is more likely to be forgiven for making a mistake than a human." In this scenario, customer loyalty can even improve.

Credit: 
University of Göttingen

Chinese health insurance achieves success decreasing diabetes medication usage, costs

Approximately 642 million people are expected to be diagnosed with diabetes by 2040, with Asians representing more than 55% of cases. Researchers conducted the first large-scale study since the implementation of medical insurance in China to evaluate the complexity and cost of drug therapy for Asian people with diabetes. They used available treatment records from Beijing's medical insurance bureau from 2016 to 2018 and looked at five outcomes: 1) quantity of outpatient medications, 2) number of co-morbidities diagnosed, 3) estimated annual cost of the outpatient drug regimen, 4) drug therapy strategies for diabetic patients and 5) the most commonly prescribed drug class in the patient cohort. They found that over three years, there was a gradual decrease of almost 9% in the average quantity of diabetes medications. The mean usage of antiglycemic and non-antiglycemic drugs decreased by 3.6% and 12.8%, respectively. Researchers found an 18.39% decrease in estimated annual medication costs. The decrease in costs could be due to more rational use of medications over the three years - especially a reduction in what the authors call the needless use of most types of insulin. China's health insurance appears to have achieved "remarkable" success. The study authors advise that therapeutic drugs should be selected with caution according to the diet and lifestyle of each individual.
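The reported drops are simple percent changes. A minimal sketch of the calculation (the yearly figures below are hypothetical, chosen only to illustrate the ~9% headline number, not the study's raw data):

```python
def pct_change(start, end):
    """Percent change from start to end; a negative value is a decrease."""
    return (end - start) / start * 100

# Hypothetical mean quantity of diabetes medications per patient,
# 2016 vs. 2018 (illustrative values only):
qty_2016, qty_2018 = 3.30, 3.01
print(round(pct_change(qty_2016, qty_2018), 1))  # -8.8, i.e. an almost 9% decrease
```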

Credit: 
American Academy of Family Physicians

Floating into summer with more buoyant, liquid-proof life jackets, swimsuits (video)

image: A new one-step method creates a liquid-proof, more buoyant cotton fabric for life jackets.

Image: 
American Chemical Society

Summertime is here, and that often means long, lazy days at the beach, water skiing and swimming. Life jackets and swimsuits are essential gear for these activities, but if not dried thoroughly, they can develop a gross, musty smell. Now, researchers reporting in ACS Applied Materials & Interfaces have developed a one-step method to create a buoyant cotton fabric for these applications that is also oil- and water-repellant. Watch a video of the fabric here.

Waterproof and oil-proof fabrics are in high demand for recreational water activities because of their low drag and self-cleaning properties. And while cotton is a popular fabric, it's hydrophilic, so most liquids and dirt can easily mess it up. To improve cotton's impermeability, previous researchers developed superamphiphobic coatings that were extremely water- and oil-repellant. But because they required multiple time-consuming steps to apply, these coatings were impractical for large-scale manufacturing. Others incorporated nanoparticles into their formulas, but there are concerns about these particles sloughing off and potentially harming the environment. Xiao Gong and Xinting Han wanted to develop a simple way to make a coating for cotton fabric so it would have superb liquid-repulsion properties and hold up in many challenging circumstances.

The researchers optimized a one-step process for a liquid-proof coating by mixing dopamine hydrochloride, 3-aminopropyltriethoxysilane and 1H,1H,2H,2H-perfluorodecyltriethoxysilane with a piece of cotton fabric for 24 hours. The three-part solution developed into a uniform, dark brown coating on the fabric. In tests, the treated cotton was impervious to many common liquids. The new solution also coated inner cotton fibers, making them liquid proof, too. In other tests, only strong acid and repeated washings reduced the material's water and oil resistance, respectively. Treated fabric soiled with fine sand was easy to clean with water, whereas water only wetted the control version. Finally, the material stayed afloat with up to 35 times its weight on it because of nanoscale air pockets that formed where the coating attached to the fabric, the researchers explain. They say their durable cotton fabric has great potential for applications where drag reduction and increased buoyancy are important, including swimsuits and life jackets.

Credit: 
American Chemical Society

Detecting wildlife illness and death with new early alert system

image: Wildlife rehabilitation specialists from UC Davis Oiled Wildlife Care Network and International Bird Rescue treat a common murre at the San Francisco Bay Oiled Wildlife Care and Education Center in Fairfield, California, in 2015.

Image: 
Gregory Urquiaga/UC Davis

From domoic acid poisoning in seabirds to canine distemper in raccoons, wildlife face a variety of threats and illnesses. Some of those same diseases make their way to humans and domestic animals in our increasingly shared environment.

A new early detection surveillance system for wildlife helps identify unusual patterns of illness and death in near real-time by tapping into data from wildlife rehabilitation organizations across California. This system has the potential to expand nationally and globally. It was created by scientists at the University of California, Davis, School of Veterinary Medicine with partners at the California Department of Fish and Wildlife and the nonprofit Wild Neighbors Database Project.

The Wildlife Morbidity and Mortality Event Alert System is described in a study published today in the journal Proceedings of the Royal Society B.

"Human-induced disturbances are contributing to a wide range of threats -- habitat loss, invasive species introductions, pollution, disease, wildfires," said co-lead author Terra Kelly, a wildlife epidemiologist at the UC Davis One Health Institute and its Karen C. Drayer Wildlife Health Center within the School of Veterinary Medicine. "It speaks to the need for a system like this where we can better understand the threats facing wildlife populations and respond to them in a timely way so there's less harm to wildlife."

FRONT-LINE RESPONDERS FOR WILDLIFE

Wildlife rehabilitation workers are the front-line responders of the free-ranging animal world. They are the first to receive and tend to sick and injured wild animals. Their clinical reports carry a wealth of information that, when shared, can indicate broader patterns.

Until recently, such clinical reports were stored primarily on paper or isolated computer files. In 2012, Wild Neighbors Database Project co-founders Devin Dombrowski and Rachel Avilla created the Wildlife Rehabilitation Medical Database, or WRMD, a free online tool now used by more than 950 rehabilitation organizations across 48 states and 19 countries to monitor patient care.

Dombrowski and Avilla brought the tool to CDFW, which connected with long-standing partners at UC Davis to pilot an alert system using the database as its foundation.

"I'm thrilled that WRMD is not only useful for thousands of wildlife rehabilitators but that the data collected by them is used for morbidity and mortality monitoring," co-author Dombrowski said. "To witness the WMME Alert System identifying data anomalies and alerting investigators is incredible."

The CDFW is using the system to help identify and prioritize wildlife needs and conservation efforts.

"The near real-time information this system provides has allowed us to quickly follow up with diagnostic testing to identify the problem," said Krysta Rogers, senior environmental scientist at the CDFW's Wildlife Health Laboratory. "This system also has been instrumental in determining the geographic range and severity of the threat."

HOW IT WORKS

To test the system, the scientists analyzed 220,000 case records collected from early 2013 to late 2018 to establish thresholds for triggering alerts. The dataset included records from 453 different species, from the common to the rare.

The authors emphasize the alert system is pre-diagnostic. It alerts agencies to unusual patterns that may warrant further investigation to determine specific health threats.

The system detected several key events, including large admissions of:

Marine birds along the central and southern California coast in late spring 2016. Post-mortem examinations confirmed they were starving.

Marine birds in April 2017. Domoic acid toxicity was later confirmed as the cause of death.

Invasive Eurasian collared doves in 2016 with encephalitis and kidney disease. Investigations revealed pigeon paramyxovirus-1 as the cause of the event. This was the first detection of the virus emerging in Eurasian collared doves in this region of California.

Rock pigeons in the San Francisco Bay Area in 2017 with an emerging parasite.

Finches in 2016 and 2017 with seasonal conjunctivitis due to infection with Mycoplasma bacteria.

HUMAN CONNECTIONS

Kelly notes that being able to monitor and rapidly detect such events is important for all species, humans included. For example, domoic acid intoxication is caused by harmful algal blooms, which are increasing in coastal and freshwater systems and threaten both wildlife and human health. Another example is West Nile virus, where bird deaths can serve as a sensitive indicator for risk to domestic animals and people.

The alert system is a complementary, inexpensive and efficient tool to add to state wildlife agencies' toolbox of surveillance efforts. It combines machine-learning algorithms, natural language processing and statistical methods for classifying cases and establishing alert thresholds with knowledge of the ecology and distribution of wildlife within California, said co-lead author Pranav Pandit, a researcher in the UC Davis One Health Institute and its EpiCenter for Disease Dynamics.
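Threshold-based alerting of this kind can be illustrated with a minimal sketch: flag a count that sits far above its historical baseline. The counts, cutoff and scenario here are hypothetical, not the system's actual parameters or method.

```python
from statistics import mean, stdev

def alert(history, current, z_threshold=3.0):
    """Flag the current count if it exceeds the historical baseline
    by more than z_threshold standard deviations."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current > mu
    return (current - mu) / sigma > z_threshold

# Hypothetical weekly seabird admissions at one facility (illustrative):
baseline_weeks = [2, 3, 1, 4, 2, 3, 2, 5, 3, 2]
print(alert(baseline_weeks, 4))   # False: within normal variation
print(alert(baseline_weeks, 30))  # True: an unusual spike worth investigating
```

A production system would additionally account for seasonality, species ecology and geographic clustering, as the authors note.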

"The wildlife rehabilitation organizations' data is making such valuable contributions," Pandit said. "That's all coming together in this highly adaptable surveillance system."

Credit: 
University of California - Davis

The hidden culprit killing lithium-metal batteries from the inside

image: In this new, false-color image of a lithium-metal test battery produced by Sandia National Laboratories, high-rate charging and recharging of lithium metal (red) greatly distorts the separator (green), creating tan reaction byproducts, to the surprise of scientists.

Image: 
Katie Jungjohann, Sandia National Laboratories

ALBUQUERQUE, N.M. -- For decades, scientists have tried to make reliable lithium-metal batteries. These high-performance storage cells hold 50% more energy than their prolific lithium-ion cousins, but higher failure rates and safety problems, like fires and explosions, have crippled commercialization efforts. Researchers have hypothesized why the devices fail, but direct evidence has been sparse.

Now, the first nanoscale images ever taken inside intact lithium-metal coin batteries (also called button cells or watch batteries) challenge prevailing theories and could help make future high-performance batteries, such as those for electric vehicles, safer, more powerful and longer lasting.

"We're learning that we should be using separator materials tuned for lithium metal," said battery scientist Katie Harrison, who leads Sandia National Laboratories' team for improving the performance of lithium-metal batteries.

Sandia scientists, in collaboration with Thermo Fisher Scientific Inc., the University of Oregon and Lawrence Berkeley National Laboratory, published the images recently in ACS Energy Letters. The research was funded by Sandia's Laboratory Directed Research and Development program and the Department of Energy.

Internal byproduct builds up, kills batteries

The team repeatedly charged and discharged lithium coin cells with the same high-intensity electric current that electric vehicles need to charge. Some cells went through a few cycles, while others went through more than a hundred cycles. Then, the cells were shipped to Thermo Fisher Scientific in Hillsboro, Oregon, for analysis.

When the team reviewed images of the batteries' insides, they expected to find needle-shaped deposits of lithium spanning the battery. Most battery researchers think that a lithium spike forms after repetitive cycling and that it punches through a plastic separator between the anode and the cathode, forming a bridge that causes a short. But lithium is a soft metal, so scientists have not understood how it could get through the separator.

Harrison's team found a surprising second culprit: a hard buildup formed as a byproduct of the battery's internal chemical reactions. Every time the battery recharged, the byproduct, called solid electrolyte interphase, grew. Capping the lithium, it tore holes in the separator, creating openings for metal deposits to spread and form a short. Together, the lithium deposits and the byproduct were much more destructive than previously believed, acting less like a needle and more like a snowplow.

"The separator is completely shredded," Harrison said, adding that this mechanism has only been observed under fast charging rates needed for electric vehicle technologies, but not slower charging rates.

As Sandia scientists think about how to modify separator materials, Harrison says that further research also will be needed to reduce the formation of byproducts.

Scientists pair lasers with cryogenics to take 'cool' images

Determining cause of death for a coin battery is surprisingly difficult. The trouble comes from its stainless-steel casing. The metal shell limits what diagnostics, like X-rays, can see from the outside, while removing parts of the cell for analysis rips apart the battery's layers and distorts whatever evidence might be inside.

"We have different tools that can study different components of a battery, but really we haven't had a tool that can resolve everything in one image," said Katie Jungjohann, a Sandia nanoscale imaging scientist at the Center for Integrated Nanotechnologies. The center is a user facility jointly operated by Sandia and Los Alamos national laboratories.

She and her collaborators used a microscope that has a laser to mill through a battery's outer casing. They paired it with a sample holder that keeps the cell's liquid electrolyte frozen at temperatures between minus 148 and minus 184 degrees Fahrenheit (minus 100 and minus 120 degrees Celsius, respectively). The laser creates an opening just large enough for a narrow electron beam to enter and bounce back onto a detector, delivering a high-resolution image of the battery's internal cross section with enough detail to distinguish the different materials.
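The quoted temperature range is a straightforward Fahrenheit-to-Celsius conversion, which a few lines of Python confirm:

```python
def f_to_c(f):
    """Convert a temperature from degrees Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

# The cryogenic stage's stated range:
print(f_to_c(-148))  # -100.0 degrees C
print(f_to_c(-184))  # -120.0 degrees C
```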

The original demonstration instrument, which was the only such tool in the United States at the time, was built and still resides at a Thermo Fisher Scientific laboratory in Oregon. An updated duplicate now resides at Sandia. The tool will be used broadly across Sandia to help solve many materials and failure-analysis problems.

"This is what battery researchers have always wanted to see," Jungjohann said.

Credit: 
DOE/Sandia National Laboratories

Virtual schooling exposes digital challenges for Black families, MU study finds

image: Adaobi Anakwe is a post-doctoral fellow at the University of Missouri School of Health Professions.

Image: 
MU School of Health Professions

COLUMBIA, Mo. -- A new study from the University of Missouri found the unanticipated transitions to virtual schooling due to COVID-19 exposed the lack of digital resources among Black families in the United States, including access to Wi-Fi and technological savviness. The findings help explain the extensive stress virtual schooling caused for many Black families trying to keep their children learning and engaged online while at home during the pandemic.

"What we found was parents and caregivers often felt disempowered in the rapidly changing environment, as they did not necessarily feel equipped with the tools or technological savviness to effectively engage in their children's education the way they felt they needed to," said Adaobi Anakwe, an MU post-doctoral fellow and the study's lead author. "Schools were sending students home with devices for online learning without first ensuring families had reliable, consistent internet access to utilize those devices, and this was a big contributor to parental stress and burnout."

Anakwe and Wilson Majee, an associate professor in the MU School of Health Professions, interviewed parents and primary caregivers of Black families in Missouri with school-aged children to better understand their experiences suddenly shifting to virtual schooling due to COVID-19.

Anakwe explained the sudden shift to virtual schooling highlighted the digital divide that already existed for many Black families, as a lack of access to reliable internet can have long-term negative impacts on learning and health outcomes.

"The COVID-19 vaccine rollout showcased how important technological resources can be for making an appointment online," Anakwe said. "And the sudden shift from in-person health care visits to telehealth highlights the role technology can play in facilitating access to health care as well as education."

Anakwe added that even before the pandemic, Black families were already disproportionately faced with single-parent households, disparities in income and unequal access to transportation, housing, healthy foods and recreational facilities.

"We already have a cafeteria menu of social determinants of health that impact Black and minority populations," Anakwe said. "We need to be proactive to prevent the digital divide from becoming another issue that gets added on to an already very long list of challenges Black families deal with."

The COVID-19 pandemic also caused an increase in technology use among students, causing some Black parents to worry about the potential impact on their children's mental health.

"Before the pandemic, parents were tasked with minimizing screen time for their kids and ensuring they spent enough time outside engaged in physical activity," Anakwe said. "Then all of a sudden, parents were forced to encourage their children to use technology to stay engaged in their school work while at home. As COVID-19 lockdowns are starting to end, it will be interesting to see how the messaging around screen time evolves."

Majee said MU Extension and the University of Missouri System Broadband Initiative have helped increase access to broadband internet for rural Missourians, but more collaborative partnerships among community leaders, schools, local governments and families are needed to assist underprivileged Black families.

"Technology is becoming increasingly necessary for success in our lives, so this research can help us better understand the technological challenges facing Black families," Majee said. "Our overall goal is to improve the health of Black families by helping our community members who are most disadvantaged - it's a labor of love."

Credit: 
University of Missouri-Columbia

Role of subnuclear NSrp70 in immunity studied at Gwangju Institute of Science & Technology

image: Importance of a protein called NSrp70, which was previously discovered in T cell subnuclear spaces, in regulating the maturation of T cells

Image: 
Gwangju Institute of Science & Technology

T lymphocytes, or T cells, are immune cells with diverse roles in building the body's immunity. How does one particular cell type fight against a host of different pathogens? The key to this adaptability is in alternative splicing, wherein the cell produces multiple forms of proteins for identifying different types of invading viruses and microbes, as well as destroying cancer cells. So, it is not surprising that finding ways to improve the production of T cells with enhanced pathogen recognition capacity is an actively researched area of modern science.

In 2011, scientists from the Gwangju Institute of Science and Technology (GIST) in Korea discovered a protein called NSrp70, which is abundant in motile T cells. Located in specific regions inside the cell's nucleus called nuclear speckles, NSrp70, short for nuclear speckle-related protein 70, is a gene regulator. Interestingly, nuclear speckles play an important role in protein production by housing mRNA splicing regulators that cut and join pre-messenger RNA fragments to produce the final messenger RNAs, which determine the proteins that are produced. With this knowledge, the scientists have been speculating about the possible role NSrp70 plays in the alternative splicing involved in T cell maturation and development.

Now, in a new study made available online on 25 May 2021 and published in Volume 49 Issue 10 of the journal Nucleic Acids Research, scientists from GIST led by Professor Chang-Duk Jun report their findings on the role of this protein and its encoding gene within a physiological environment. Explaining their motivation for the research, Prof. Jun says, "Ever since we discovered and presented NSrp70 to the academic world, we have been exploring its functions and mechanisms of action."

In experiments with mice, the scientists were able to analyze the function of the protein using a technique known as conditional gene knockout, whereby they inactivated the gene coding for NSrp70 and produced cell samples that lacked the protein. By comparing these cell samples with regular cells, they determined the effects caused by the absence of the protein.

They report that NSrp70 is expressed very early in the cell cycle and plays an important role in T cell development. Specifically, the absence of the protein induced uncontrolled cell growth and death in double-positive thymocytes, which are T cell precursors. This impeded their progression to single-positive thymocytes, stymying the formation of mature T cells. NSrp70-deficient mice had a noticeably reduced lymphocyte count in peripheral tissues and, as a result, showed unchecked tumor growth, further attesting to the role NSrp70 plays in cancer development. Prof. Jun sums up the findings emphatically: "Our study revealed that NSrp70 is an important regulator of T cell proliferation. This finding can help us mass-produce specific T cells for cell therapy or use mass-produced T cells to inhibit the proliferation of cancer cells through gene therapy."

The world welcomes such critical findings on nuclear speckles and constituent proteins with eager anticipation!

Credit: 
GIST (Gwangju Institute of Science and Technology)

New method makes vital fertilizer element in a more sustainable way

image: The electrocatalytic reaction between these building blocks could make urea production much more energy efficient.

Image: 
The University of Texas at Austin

Urea is a critical ingredient in everything from fertilizers to skin care products. Large-scale production of urea, a compound that occurs naturally in human urine, is a massive undertaking, accounting for about 2% of global energy use and emissions today.

For decades, scientists and engineers have sought to make this process more energy efficient as demand for fertilizer grows with increased population. An international research team that includes scientists and engineers from The University of Texas at Austin has devised a new method for making urea that is more environmentally friendly than today's process and produces enough to be competitive with energy-intensive industrial methods.

Making urea today involves a two-step thermal process that requires high levels of heat and pressure under controlled harsh environments. But this new process requires just one step and relies on a concept called electrocatalysis that uses electricity -- and potentially sunlight -- to trigger chemical reactions in a solution at room temperature in ambient conditions.

"Around the world we need to lower emissions. That's why we want to develop these more sustainable pathways to produce urea using electrocatalysis instead of this energy-intensive two-step process," said Guihua Yu, an associate professor of materials science in the Cockrell School of Engineering's Walker Department of Mechanical Engineering who co-lead the team that published a new milestone paper about the process in Nature Sustainability.

Today, synthetic urea is produced primarily via the Haber-Bosch method -- known as one of the greatest inventions of the 20th century because it enabled mass production of fertilizer and helped increase the global food supply. It combines nitrogen and hydrogen to make ammonia, which then bonds with carbon dioxide to make urea. This two-step process requires heating to 400 to 500 degrees Celsius to perform the reaction, using huge amounts of energy and producing significant emissions along the way.

Producing urea through electrocatalysis is an alternative process that is more sustainable and energy efficient. Historically, however, this method has not produced enough urea to be viable: it created too many byproducts and required too much energy to break the bonds of the molecular building blocks and trigger the reaction.

Finding the right elements or catalysts to create an efficient chemical reaction was the primary challenge. The UT team used nitrate, instead of the typical nitrogen, to bond with carbon dioxide. And the catalyst solution is composed of indium hydroxide nanomaterials.

This highly efficient nanomaterial electrocatalyst has "high selectivity," Yu said, meaning it produces only what the researchers want it to produce, not a bunch of byproducts. And it creates a higher yield of urea than previous attempts using electrocatalysis.

"It takes much less energy to break the bonds of nitrate, compared to nitrogen, and that helps produce a lot higher yield of urea," Yu said.

Yu sees this formula as applicable at both large and small scales. Electrocatalytic devices could be sold to individual farms, letting farmers generate their own urea for soil. The hope is also to provide alternatives to large-scale industrial processes that reduce energy use, which can play a role in a more sustainable future as the population grows and demand for urea rises.

Next steps involve further improving the yield and selectivity, as well as building a prototype device that can scale up production. The research team is also trying to find a way to power the process with solar energy rather than electricity from the grid.

Credit: 
University of Texas at Austin

Roadless forests see more blazes and greater severity, but fire resilience is the result

image: The Beachie Creek fire in 2020, when it was still small.

Image: 
James Johnston, OSU College of Forestry

CORVALLIS, Ore. - Roadless national forests in the American West burn more often and at a slightly higher severity than national forests with roads, but the end result for the roadless forests is greater fire resilience, Oregon State University researchers say.

The findings, published today in Environmental Research Letters, provide a key piece of the puzzle for a region trying to develop better approaches to living with fire in the wake of a 2020 fire season that brought historically disastrous blazes.

Limiting smoke exposure and reducing risk to water supplies, habitat and human infrastructure from huge, uncontrolled fires are important goals of policymakers, said James Johnston, a researcher in the OSU College of Forestry and the study's leader.

Mechanical fuel treatments - piling brush, thinning dense stands of trees, etc. - are a common tool for meeting those goals, but more than half of all fires, including most of the largest ones, burn mainly in roadless areas, where mechanical treatments are usually prohibited.

"The extent of fire where management options are limited makes clear the need to adapt to, rather than overcome, fire," he said.

Differences in fire extent and fire escape - a fire getting beyond the area you think it should stay contained in - are strongly associated with roadless vs. non-roadless management, Johnston said. But the real drivers of fire severity - i.e. tree mortality - are differences in environment and not land use designations.

Trees growing in sites at higher elevations with greater moisture availability and lower temperatures - which describes most of the roadless sites - are generally less fire tolerant than species found in drier, lower-elevation landscapes.

Created in 1905, the U.S. Forest Service oversees nearly 190 million acres of national forests, most of it in the West. The area managed by the USFS makes up one-fifth of all forestland in the United States and 1.5% globally.

Historically, federal legislation typically required the agency to emphasize timber cutting, but the Wilderness Act of 1964 called for the creation of areas where natural conditions would be preserved.

"The act also required the Forest Service to inventory all of its roadless areas not designated as wilderness, pending future action by Congress," Johnston said. "Any of those roadless areas not released for development in the 1970s and '80s ended up becoming an unofficial extension of the wilderness system, and then in 2001, the Roadless Area Conservation Rule generally prohibited building roads and harvesting timber in those areas."

That created two distinct management regimes: an active one featuring road-filled landscapes and a history of recreational development and timber harvesting, and another with no roads, no development and little or no harvesting history. The breakdown is roughly 50-50.

"Human influences are largely absent in roadless areas, the management of which is largely a matter of decisions about how to deal with natural disturbances like wildfire," Johnston said.

Before 1910, frequent low-severity surface fires played a key role in maintaining forests. In the decades since, the comparative lack of fire that resulted from federal policy - in concert with grazing, logging and land-use changes - has caused major structural shifts in older forests as shade-tolerant and fire-intolerant species have moved in.

The policy of fire suppression traces its roots to the Great Fire of 1910, which killed 87 people, destroyed several towns and burned an area roughly the size of Connecticut. The blaze consumed 3 million acres of forest in Idaho, Montana, Washington and British Columbia.

"Wildfire is an important disturbance process that shapes the structure, composition and function of forests, and a better understanding of how passive versus active management relates to fire patterns is critical for managers trying to meet new objectives to restore forests to their natural fire regime," Johnston said. "Over the last three decades, roughly one-third of the roadless landscape experienced fire, while less than one-fifth of the 'roaded' lands did."

That's despite the fact that roadless areas had far fewer ignition events and are generally in regions that are cooler and moister.

"Most of the largest fires that have burned on national forestland in recent years began in roadless areas," said study co-author Jack Kilbride, a Ph.D. student in OSU's College of Earth, Ocean and Atmospheric Sciences. "But evidence suggests that the greater extent of fire in roadless areas has potential to make those landscapes more resilient in the face of climate change. This study really shows the usefulness of satellite data for being able to characterize how fire patterns differ as a function of management."

The legacy of fire suppression includes increased forest density, shifts in species composition and loss of resiliency to fire, drought and insects, the researchers say. But a number of recent studies have shown that forests in wilderness and other roadless areas that have experienced multiple fires are less likely to experience stand-replacing fire and are getting back to the structure and composition they featured prior to white settlement.

"Mechanical thinning, prescribed fire and wildland fire will continue to be used as tools on the 'roaded' landscape," Johnston said. "And without major policy changes, wildland fire will continue to be the primary weapon available in roadless areas. Working together, forest managers and scientists can determine which management objectives are seeing progress, and how much."

Credit: 
Oregon State University

Molecular bridge mediates inhibitory synapse specificity in the cortex

image: Authors Yasufumi Hayano and Hiroki Taniguchi

Image: 
Courtesy of the Max Planck Florida Institute for Neuroscience

With its breathtaking views and striking stature, the Golden Gate Bridge certainly deserves its title as one of the modern wonders of the world. Its elegant art deco style and iconic towers offer visitors a once-in-a-lifetime opportunity for astounding photographs. Stretching for almost 2 miles, the Golden Gate serves as a critical gateway, facilitating the exchange of ideas, commodities, and people.

Though not to the same grandiose scale, our brains have similar gateways to connect neurons. These tiny compartments, called synapses, enable the dynamic exchange of information and the formation of neural circuits. To build these circuits, developing neurons must first follow specific guidance cues, traveling across the brain until finding their proper partners. This process is especially important for the cerebral cortex, which consists of six functionally and anatomically distinct layers. Though the cortex has been extensively studied, not much is known about the precise molecular mechanisms driving synapse specificity within its layers. This is especially true for a specialized class of neurons called inhibitory interneurons (INs), which typically make local connections with just one or two layers. Uncovering the molecules at play would further the understanding of cortical inhibitory circuit formation.

In a recent publication in the journal Science Advances, Max Planck Florida's Taniguchi lab has shed light on a new mechanism for inhibitory synapse specificity in the cortex. Identifying a novel role for the cell adhesion molecule IgSF11, MPFI scientists have discovered that the protein mediates layer-specific synaptic targeting in cortical Chandelier Cells (ChCs).

"Our lab specializes in the study of cortical interneurons and inhibitory circuit formation," says Hiroki Taniguchi, Ph.D., research group leader at Max Planck Florida. "Chandelier Cells, one of our favorite interneuron subtypes, have been shown to express unique genetic markers and innervate only certain layers within the cortex. (ChCs critically control spike generation in cortical principal neurons and have been implicated in the pathology of brain disorders such as schizophrenia and epilepsy.) We decided that this model cell-type would be the perfect place to begin our search for molecules that confer layer-specific synapse matching."

MPFI scientists started their investigation using single-cell RNA sequencing to genetically screen INs for genes unique to an individual subtype. They found a select pool of genes in an intriguing category known as cell adhesion molecules, or CAMs. One CAM in particular, IgSF11, was highly enriched in ChCs compared to other IN subtypes.

"Our genetic screening of INs is where we first came across IgSF11," explains Yasufumi Hayano, Ph.D., first author of the publication and research scientist in the Taniguchi Lab. "We were looking for subtype-specific genes that encode cell surface proteins, thinking that those expressed on the outside of neurons would be the perfect candidate to mediate a synapse-specific interaction."

CAMs comprise a diverse group of structural proteins. Often thought of as a biological glue, CAMs are expressed on the outside of neurons and interact in large complexes, facilitating cell-to-cell interactions. The bridge-like complex they form offers stability for newly formed synapses and aids in cell adhesion and communication. One category, called homophilic CAMs, interacts only with identical copies of itself and is theorized to mediate the specificity of synapse formation.

After identifying IgSF11 as a homophilic CAM, the MPFI team looked for IgSF11 expression in neurons from the upper half of layer 2/3 of the cortex, which ChCs innervate, reasoning that expression would need to occur on both sides for a homophilic CAM interaction. Using fluorescent in situ hybridization (FISH), researchers found robust expression of IgSF11 in both ChCs and target neurons residing within layer 2/3 of the cortex, but not in other layers, providing strong evidence that the IgSF11 interaction is important in ChC synapse specificity.

Next, the Taniguchi lab assessed the functional role of IgSF11 in the formation of ChC synapses by removing IgSF11 from the brain and examining the changes. To parse out whether IgSF11 was functionally necessary in ChCs alone or in both ChCs and their target cortical neurons, the team had to develop a strategy that allowed the selective removal of IgSF11. To accomplish this, MPFI scientists generated IgSF11 knockout (KO) mice and transplanted fluorescently identified KO ChCs into wild-type (wt) host animals. KO ChCs displayed a significant reduction in both the size and number of synaptic boutons. Corroborating the hypothesis that IgSF11 confers its specificity through homophilic interaction, transplanting wt ChCs into the brains of IgSF11 KO mice resulted in the same reduction. Taken together, IgSF11 seems to be strongly implicated in ChC synaptic bouton development and morphological differentiation.

Collaborations with MPFI's electron microscopy core and the Kwon lab delved further into the functional consequences of IgSF11 knockout. Ultrastructural analysis using high-magnification EM revealed that the few remaining synaptic boutons in KO ChCs did not differentiate properly. Supporting this data, optogenetics-assisted electrophysiology of IgSF11 KO mice demonstrated deficits in synaptic transmission.

"One challenge working with Chandelier cells is that they are difficult to genetically manipulate using traditional methods," explains Dr. Hayano. "To overcome this, we devised a new viral-based strategy using adeno associated virus to deliver IgSF11, a difficult-to-express protein, to cells of interest."

The MPFI team used their AAV strategy to investigate whether IgSF11 expressed in neurons from cortical layers other than layer 2/3 could artificially induce the formation of synapses with ChCs. Transducing neurons in layer 5 with IgSF11, they discovered numerous ectopic synapses formed between these cells and ChCs, a phenomenon that would not occur under normal circumstances.

"IgSF11 is the very first cell adhesion molecule identified that directly mediates interneuron subtype, layer-specific formation of synapses in the cortex," notes Dr. Taniguchi. "Further elucidating the molecular mechanisms surrounding inhibitory circuit assembly may reveal a similar pattern in other distinct interneuron subtypes and help to unravel how inhibitory circuits form. Our work may provide a useful entry point into understanding the etiology of neurodevelopmental disorders caused by circuit deficit in unique interneuron subtypes."

Credit: 
Max Planck Florida Institute for Neuroscience

Hard to swallow: Coral cells seen engulfing algae for first time

image: As marine heatwaves become more commonplace, coral reefs are expelling their microscopic and colorful algae and bleaching white. Scott Reef, Australia, April 2016.

Image: 
Australian Institute of Marine Science

For the first time, scientists have seen stony coral cells engulf dinoflagellates - single-celled, photosynthetic algae that are crucial for keeping coral alive.

The researchers used a cell line called IVB5, which contains endoderm-like cells cultured from the stony coral, Acropora tenuis.

Around 40% of coral cells incorporated the algae within about 30 minutes and remained healthy for one month.

The research is a step towards understanding the partnership between coral and dinoflagellates and could shed light on how coral bleaching occurs.

In a world-first, scientists in Japan have observed individual stony coral cells engulfing single-celled, photosynthetic algae.

The microscopic algae, known as dinoflagellates, were engulfed by cells cultured from the stony coral, Acropora tenuis, the scientists reported in the journal Frontiers in Marine Science.

"Dinoflagellates are crucial for keeping coral healthy and alive," said Professor Noriyuki Satoh, senior author of the study and head of the Marine Genomics Unit at the Okinawa Institute of Science and Technology Graduate University. "Coral cells take up the algae and provide them with shelter and the building blocks for photosynthesis. In return, the algae provide the corals with nutrients that they synthesize."

However, in recent decades, this essential relationship has been placed under strain. Driven by pollution, acidification and rising ocean temperatures, stressed coral cells are expelling the microscopic and colorful algae in mass bleaching events, resulting in huge swathes of dead, white reefs.

Stony coral from the Acroporidae family - the most common type of coral found within tropical and subtropical reefs - are particularly susceptible to these bleaching events. These fast-growing corals lay down calcium carbonate skeletons and therefore play a key role in building coral reefs.

"For coral reef conservation, it's vital for us to fully understand the partnership between stony coral and the algae that live inside these animals, at the level of a single cell," explained co-first author Professor Kaz Kawamura from Kochi University. "But until recently, this was very hard to achieve."

Coral cells are notoriously difficult to culture, so previously scientists had to rely on experimental systems of other closely related marine creatures, like sea anemones, to study the mechanism of how the dinoflagellates enter and leave cells.

It wasn't until April 2021 that the research team made a major leap forward, reporting in Marine Biotechnology that they had successfully cultured different cell lines from larvae of the stony coral, Acropora tenuis, in petri dishes.

For this study, the scientists focused on one coral cell line called IVB5. Many of the cells in this line have similar properties to endodermal cells, in terms of their form, behavior and gene activity. Importantly, in whole coral organisms, it is the endodermal cells that engulf the algae.

The scientists added the dinoflagellate, Breviolum minutum, to a petri dish containing the IVB5 coral cells.

Around 40% of the coral cells in the culture quickly formed long, finger-like projections that reached out to contact the dinoflagellates. The algae were then "swallowed" up, in a process taking around 30 minutes.

"It was amazing to see - it was almost a dream!" said Prof. Satoh.

Over the following couple of days, the algae inside the cell were either broken down into fragments or were successfully enclosed into membrane-bound sacs, called vacuoles, within the cells. For the researchers, this hints at how the relationship possibly started millennia ago.

"It may be that originally, the ancestors of coral engulfed these algae and broke them down for food. But then over time, they evolved to use the algae for photosynthesis instead," co-first author, Dr. Satoko Sekida from Kochi University suggested.

The researchers are now using electron microscopes to gain more detailed images of how the coral cells engulf the dinoflagellates. They are also working on genetic experiments to pinpoint which coral genes are involved.

At this stage, the coral cells containing the algae live for around a month before dying. In the near future, the team hope to achieve a stable culture where both the coral cells and dinoflagellates can reproduce together.

"This would be very exciting as then we can ask new questions, like how the corals react when placed under stress," said Prof. Satoh. "This could give us a more complete understanding of how bleaching occurs, and how we can mitigate it."

Credit: 
Okinawa Institute of Science and Technology (OIST) Graduate University

Quantum physics helps destroy cancer cells

image: When X-rays are irradiated onto tumor tissue containing iodine-carrying nanoparticles, the iodine releases electrons that break DNA and kill the cancer cells.

Image: 
Mindy Takamiya/Kyoto University iCeMS

Cancer cell death is triggered within three days when X-rays are shone onto tumor tissue containing iodine-carrying nanoparticles. The iodine releases electrons that break the tumor's DNA, leading to cell death. The findings, by scientists at Kyoto University's Institute for Integrated Cell-Material Sciences (iCeMS) and colleagues in Japan and the US, were published in the journal Scientific Reports.

"Exposing a metal to light leads to the release of electrons, a phenomenon called the photoelectric effect. An explanation of this phenomenon by Albert Einstein in 1905 heralded the birth of quantum physics," says iCeMS molecular biologist Fuyuhiko Tamanoi, who led the study. "Our research provides evidence that suggests it is possible to reproduce this effect inside cancer cells."
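Einstein's relation for the effect described in the quote above (a textbook formula, not taken from this study) ties the ejected electron's energy to the incoming photon:

```latex
E_k = h\nu - \phi
```

where \(h\nu\) is the photon energy and \(\phi\) is the work function of the material; electrons are released only when \(h\nu\) exceeds \(\phi\).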

A long-standing problem with cancer radiation therapy is that it is not effective at the center of tumors, where oxygen levels are low because blood vessels do not penetrate deeply into the tissue. X-ray irradiation needs oxygen to generate DNA-damaging reactive oxygen species when the rays hit molecules inside the cell.

Tamanoi, together with Kotaro Matsumoto and colleagues, has been trying to overcome this issue by finding more direct ways to damage cancer DNA. In earlier work, they showed that gadolinium-loaded nanoparticles could kill cancer cells when irradiated with 50.25 keV synchrotron-generated X-rays.

In the current study, they designed porous, iodine-carrying organosilica nanoparticles. Iodine is cheaper than gadolinium and releases electrons at lower energy levels.

The researchers dispersed their nanoparticles through tumor spheroids, 3D tissue containing multiple cancer cells. Irradiating the spheroids for 30 minutes with 33.2 keV X-rays led to their complete destruction within three days. By systematically changing the energy level, they were able to demonstrate that tumor destruction is optimal at 33.2 keV.

Further analyses showed that the nanoparticles were taken up by the tumor cells, localizing just outside their nuclei. Shining just the right amount of X-ray energy onto the tissue prompted iodine to release electrons, which then caused double-strand breaks in the nuclear DNA, triggering cell death.

"Our study represents an important example of employing a quantum physics phenomenon inside a cancer cell," says Matsumoto. "It appears that a cloud of low-energy electrons is generated close to DNA, causing double strand breaks that are difficult to repair, eventually leading to programmed cell death."

The team next wants to understand how electrons are released from iodine atoms when they are exposed to X-rays. They are also working on placing iodine on DNA rather than near it to increase efficacy, and on testing the nanoparticles in mouse models of cancer.

Key contributors to this work are Yuya Higashi (iCeMS), Hiroyuki Saitoh (QST) and Toshiki Tajima (UC Irvine, Dept. of Physics & Astronomy) in addition to Tamanoi and Matsumoto.

Credit: 
Kyoto University

Liquid metal sensors and AI could help prosthetic hands to 'feel'

video: Researchers used individual fingertips fitted with stretchable tactile sensors with liquid metal on a prosthesis to distinguish between different speeds of a sliding motion along different textured surfaces. The four prosthetic hand fingertips simultaneously distinguished between complex, multi-textured surfaces - demonstrating a new form of hierarchical intelligence.

Image: 
Florida Atlantic University/College of Engineering and Computer Science

Each fingertip has more than 3,000 touch receptors, which largely respond to pressure. Humans rely heavily on sensation in their fingertips when manipulating an object. The lack of this sensation presents a unique challenge for individuals with upper limb amputations. While there are several high-tech, dexterous prosthetics available today - they all lack the sensation of "touch." The absence of this sensory feedback results in objects inadvertently being dropped or crushed by a prosthetic hand.

To enable a more natural feeling prosthetic hand interface, researchers from Florida Atlantic University's College of Engineering and Computer Science and collaborators are the first to incorporate stretchable tactile sensors using liquid metal on the fingertips of a prosthetic hand. Encapsulated within silicone-based elastomers, this technology provides key advantages over traditional sensors, including high conductivity, compliance, flexibility and stretchability. This hierarchical multi-finger tactile sensation integration could provide a higher level of intelligence for artificial hands.

For the study, published in the journal Sensors, researchers used individual fingertips on the prosthesis to distinguish between different speeds of a sliding motion along different textured surfaces. The four different textures had one variable parameter: the distance between the ridges. To detect the textures and speeds, researchers trained four machine learning algorithms. For each of the ten surfaces, 20 trials were collected to test the ability of the machine learning algorithms to distinguish between the ten different complex surfaces composed of randomly generated permutations of four different textures.

Results showed that the integration of tactile information from liquid metal sensors on four prosthetic hand fingertips simultaneously distinguished between complex, multi-textured surfaces - demonstrating a new form of hierarchical intelligence. The machine learning algorithms were able to distinguish between all the speeds with each finger with high accuracy. This new technology could improve the control of prosthetic hands and provide haptic feedback, more commonly known as the experience of touch, for amputees to reconnect a previously severed sense of touch.

"Significant research has been done on tactile sensors for artificial hands, but there is still a need for advances in lightweight, low-cost, robust multimodal tactile sensors," said Erik Engeberg, Ph.D., senior author, an associate professor in the Department of Ocean and Mechanical Engineering and a member of the FAU Stiles-Nicholson Brain Institute and the FAU Institute for Sensing and Embedded Network Systems Engineering (I-SENSE), who conducted the study with first author and Ph.D. student Moaed A. Abd. "The tactile information from all the individual fingertips in our study provided the foundation for a higher hand-level of perception enabling the distinction between ten complex, multi-textured surfaces that would not have been possible using purely local information from an individual fingertip. We believe that these tactile details could be useful in the future to afford a more realistic experience for prosthetic hand users through an advanced haptic display, which could enrich the amputee-prosthesis interface and prevent amputees from abandoning their prosthetic hand."

Researchers compared four different machine learning algorithms for their successful classification capabilities: K-nearest neighbor (KNN), support vector machine (SVM), random forest (RF), and neural network (NN). The time-frequency features of the liquid metal sensors were extracted to train and test the machine learning algorithms. The NN generally performed the best at the speed and texture detection with a single finger and had a 99.2 percent accuracy to distinguish between ten different multi-textured surfaces using four liquid metal sensors from four fingers simultaneously.
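As a rough, hypothetical sketch of the four-way comparison described above, the same scikit-learn model families can be benchmarked on synthetic data; the generated features below merely stand in for the study's liquid-metal time-frequency features, and every name and shape here is an assumption, not taken from the paper:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.neural_network import MLPClassifier

# 10 "surfaces" (classes) with 20 trials each, mirroring the study's
# protocol; the 40 synthetic features stand in for time-frequency
# features extracted from the liquid metal sensors.
X, y = make_classification(n_samples=200, n_features=40,
                           n_informative=20, n_classes=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)

# The four model families compared in the study.
models = {
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "SVM": SVC(kernel="rbf"),
    "RF":  RandomForestClassifier(n_estimators=100, random_state=0),
    "NN":  MLPClassifier(hidden_layer_sizes=(64,), max_iter=2000,
                         random_state=0),
}

# Fit each classifier and report held-out accuracy.
scores = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
          for name, m in models.items()}
for name, acc in scores.items():
    print(f"{name}: {acc:.2f}")
```

In practice the ranking depends heavily on the feature extraction and tuning; the 99.2 percent figure applies to the study's own sensor data, not to this toy setup.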

"The loss of an upper limb can be a daunting challenge for an individual who is trying to seamlessly engage in regular activities," said Stella Batalama, Ph.D., dean, College of Engineering and Computer Science. "Although advances in prosthetic limbs have been beneficial and allow amputees to better perform their daily duties, they do not provide them with sensory information such as touch. They also don't enable them to control the prosthetic limb naturally with their minds. With this latest technology from our research team, we are one step closer to providing people all over the world with a more natural prosthetic device that can 'feel' and respond to its environment."

Credit: 
Florida Atlantic University

New mechanism of superconductivity discovered in graphene

image: A hybrid system consisting of an electron gas in graphene (top layer) separated from a two-dimensional Bose-Einstein condensate, represented by indirect excitons (blue and red layers). The electrons in the graphene and the excitons are coupled by the Coulomb force.

Image: 
Institute for Basic Science

Superconductivity is a physical phenomenon in which the electrical resistance of a material drops to zero below a certain critical temperature. Bardeen-Cooper-Schrieffer (BCS) theory is a well-established explanation that describes superconductivity in most materials. It states that Cooper pairs of electrons form in the lattice at sufficiently low temperatures and that BCS superconductivity arises from their condensation. While graphene itself is an excellent conductor of electricity, it does not exhibit BCS superconductivity due to the suppression of electron-phonon interactions. This is also the reason that most 'good' conductors, such as gold and copper, are 'bad' superconductors.
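In the weak-coupling limit, the standard BCS estimate of the critical temperature (a textbook result, not specific to this study) makes the role of the electron-phonon coupling explicit:

```latex
k_B T_c \approx 1.13\,\hbar\omega_D\, e^{-1/(N(0)V)}
```

Here \(\omega_D\) is the Debye frequency, \(N(0)\) the density of states at the Fermi level, and \(V\) the electron-phonon coupling; a suppressed coupling \(V\) pushes \(T_c\) exponentially toward zero, which is why graphene shows no conventional BCS superconductivity.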

Researchers at the Center for Theoretical Physics of Complex Systems (PCS), within the Institute for Basic Science (IBS, South Korea) have reported on a novel alternative mechanism to achieve superconductivity in graphene. They achieved this feat by proposing a hybrid system consisting of graphene and 2D Bose-Einstein condensate (BEC). This research is published in the journal 2D Materials.

Along with superconductivity, BEC is another phenomenon that arises at low temperatures. It is the fifth state of matter, first predicted by Einstein in 1924. The formation of BEC occurs when low-energy atoms clump together and enter the same energy state, and it is an area that is widely studied in condensed matter physics. A hybrid Bose-Fermi system essentially represents a layer of electrons interacting with a layer of bosons, such as indirect excitons, exciton-polaritons, etc. The interaction between Bose and Fermi particles leads to various novel and fascinating phenomena, which piques interest from both the fundamental and application-oriented perspectives.

In this work, the researchers report a new mechanism of superconductivity in graphene, which arises from interactions between electrons and "bogolons", rather than phonons as in typical BCS systems. Bogolons, or Bogoliubov quasiparticles, are excitations within a BEC that have some characteristics of a particle. In certain parameter ranges, this mechanism permits critical temperatures for superconductivity of up to 70 Kelvin in graphene. The researchers also developed a new microscopic BCS theory which focuses specifically on the novel hybrid graphene-based system. Their proposed model also predicts that superconducting properties can be enhanced with temperature, resulting in a non-monotonic temperature dependence of the superconducting gap.

Furthermore, the research showed that the Dirac dispersion of graphene is preserved in this bogolon-mediated scheme. This indicates that this superconducting mechanism involves electrons with relativistic dispersion -- a phenomenon that is not so well-explored in condensed matter physics.

"This work sheds light on an alternative way to achieve high-temperature superconductivity. Meanwhile, by controlling the properties of a condensate, we can tune the superconductivity of graphene. This suggests another channel to control superconductor devices in the future," explains Ivan Savenko, leader of the Light-Matter Interaction in Nanostructures (LUMIN) team at the PCS IBS.

Credit: 
Institute for Basic Science

Encrypting photos on the cloud to keep them private

New York, NY--July 13, 2021--The past decade has witnessed scandal after scandal over private images maliciously or accidentally made public. A new study from computer scientists at Columbia Engineering reveals what may be the first way to encrypt personal images on popular cloud photo services, such as those from Google, Apple, Flickr and others, all without requiring any changes to -- or trust in -- those services.

Smartphones now make it easy for virtually everyone to snap photos, with market research firm InfoTrends estimating that people now take more than a trillion photos each year. Smartphones' limited storage, and their vulnerability to accidental loss and damage, lead many users to store their images online via cloud photo services. Google Photos is especially popular, with more than a billion users.

However, these online photo collections are not just valuable to their owners, but to attackers seeking to unearth a gold mine of personal data, as the case of the 2014 celebrity nude photo hacks made clear. Unfortunately, security measures such as passwords and two-factor authentication may not be enough to protect these images anymore, as the online services storing these photos can themselves sometimes be the problem.

"There are many cases of employees at online services abusing their insider access to user data, like Snapchat employees looking at people's private photos," said John S. Koh, the lead author of the paper, who just finished his PhD with professors of computer science Jason Nieh and Steven M. Bellovin. "There have even been bugs that reveal random users' data to other users, which actually happened with a bug in Google Photos that revealed users' private videos to other entirely random users."

A potential solution to this problem would be to encrypt the photos so no one but the proper users can view them. However, cloud photo services are currently not compatible with existing encryption techniques. For example, Google Photos compresses uploaded files to reduce their sizes, but this would corrupt encrypted images, rendering them garbage.

Even if compression worked on encrypted images, mobile users of cloud photo services typically expect to have a way to quickly browse through identifiable photo thumbnails, something not possible with any existing photo encryption schemes. A number of third-party photo services do promise image encryption and secure photo hosting, but these all require users to abandon existing widely used services such as Google Photos.

Now Columbia Engineering researchers have created a way for mobile users to enjoy popular cloud photo services while protecting their photos. The system, dubbed Easy Secure Photos (ESP), encrypts photos uploaded to cloud services so that attackers -- or the cloud services themselves -- cannot decipher them. At the same time, users can visually browse and display these images as if they weren't encrypted. They presented their study, "Encrypted Cloud Photo Storage Using Google Photos," at MobiSys 2021, the 19th ACM International Conference on Mobile Systems, Applications, and Services, on June 30, 2021.

"Even if your account is hacked, attackers can't get your photos because they are encrypted," said Jason Nieh, professor of computer science and co-director of the Software Systems Laboratory.

ESP employs an image encryption algorithm whose resulting files can be compressed and still get recognized as images, albeit ones that look like black and white static to anyone except authorized users. In addition, ESP works for both lossy and lossless image formats such as JPEG and PNG, and is efficient enough for use on mobile devices. Encrypting each image results in three black-and-white files, each one encoding details about the original image's red, green, or blue data.
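The three-file layout described above can be illustrated with a minimal NumPy sketch; this shows only the lossless split into per-channel grayscale planes, not ESP's actual keyed encryption, and the function names are illustrative rather than from the paper:

```python
import numpy as np

def split_channels(rgb):
    # One grayscale plane per color channel, mirroring ESP's
    # three-file layout (the real system also encrypts each plane).
    return [rgb[..., c].copy() for c in range(3)]

def merge_channels(planes):
    # Reassemble the three planes into one RGB image, losslessly.
    return np.stack(planes, axis=-1)

# A random 8x8 image stands in for a photo.
img = np.random.randint(0, 256, size=(8, 8, 3), dtype=np.uint8)
planes = split_channels(img)       # three 8x8 grayscale arrays
restored = merge_channels(planes)  # identical to the original
```

In the real system each plane is additionally scrambled with a secret key, so that without the key the three files look like black-and-white static.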

Moreover, ESP creates and uploads encrypted thumbnail images to cloud photo services. Authorized users can quickly and easily browse thumbnail galleries using image browsers that incorporate ESP.

"Our system adds an extra layer of protection beyond your password-based account security," said Koh, who designed and implemented ESP. "The goal is to make it so that only your devices can see your sensitive photos, and no one else can unless you specifically share the photos with them."

The researchers wanted to make sure that each user could use multiple devices to access their online photos if desired. The problem is the same digital code or "key" used to encrypt a photo has to be the same one used to decrypt the image, "but if the key is on one device, how do you get it to another?" Nieh said. "Lots of work has shown that users do not understand keys and requiring them to move them around from one device to another is a recipe for disaster, either because the scheme is too complicated for users to use, or because they copy the key the wrong way and inadvertently give everyone access to their encrypted data."

The computer scientists developed an easy-to-use way for users to manage these keys that eliminates the need for users to know or care about keys. All a user has to do in order to help a new device access ESP-encrypted photos is to verify it with another device on which they have already installed and logged into an ESP-enabled app. This makes it possible "for multiple trusted devices to still view encrypted photos," Nieh said.

"The need to handle keys, and handle them properly, has been the downfall of almost every other encryption system," Bellovin said.

The researchers implemented ESP in Simple Gallery, a popular photo gallery app on Android with millions of users. It could encrypt images from Google Photos, Flickr and Imgur without changes needed to any of these cloud photo services, and led to only modest increases in upload and download times.

"We are experiencing the beginning of a major technological boom where even average users move towards moving all their data into the cloud. This comes with great privacy concerns that have only recently started rearing their ugly heads, such as the increasing number of discovered cases of cloud service employees looking at private user data," Koh said. "Users should have an option to protect their data that they think is really important in these popular services, and we explore just one practical solution for this."

A number of companies have expressed interest in the new system. "We have a working implementation that we are releasing to developers and other researchers, but not yet to the general public," Koh said.

Credit: 
Columbia University School of Engineering and Applied Science