Tech

A new method measures the integration or segregation of immigrants based on their tweets

An international team led by researchers from the Spanish National Research Council (CSIC) has developed a method to measure the integration or segregation of immigrants based on the messages they write on the social network Twitter.

In the work, published in the journal PLOS ONE, the team developed a method that uses Twitter data to analyse the degree of spatial segregation of immigrant communities. "The users' communities of origin are determined by the language in which the tweets are posted, establishing an 'idiomatic algebra' to assign the most likely community to which a tweet belongs," explains the study's director, José Javier Ramasco, a CSIC researcher at the Institute for Cross-Disciplinary Physics and Complex Systems in Mallorca, Spain.

"If all the messages are in the local language, then the user is considered to be a local resident. If, on the other hand, some messages are in the language of an immigrant community, it can be assumed that the user knows that language and belongs to that community," he adds.

The language used, together with the location of the messages, makes it possible to find the typical residential areas of the different communities and to study whether they are more or less concentrated in those areas than the local population. "This method has allowed us to analyse immigrant communities in 53 of the world's largest cities. In each one of them we can define a metric that measures the spatial integration capacity of the immigrants living there," Ramasco explains.
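The release does not define the metric itself; as a rough illustration of the kind of quantity involved, a classic index of dissimilarity compares how a community and the local population are distributed across a city's zones (the study's actual measure may differ):

```python
# Standard index of dissimilarity, shown only to illustrate the kind of
# spatial metric involved; the study defines its own integration measure.

def dissimilarity_index(community_counts, local_counts):
    """0 = identical spatial distribution; 1 = complete segregation.
    Each argument maps a city zone to the number of residents of that group."""
    total_c = sum(community_counts.values())
    total_l = sum(local_counts.values())
    zones = set(community_counts) | set(local_counts)
    return 0.5 * sum(
        abs(community_counts.get(z, 0) / total_c - local_counts.get(z, 0) / total_l)
        for z in zones
    )

# Hypothetical three-zone city: the immigrant community clusters in zone "A".
print(dissimilarity_index({"A": 80, "B": 15, "C": 5},
                          {"A": 30, "B": 40, "C": 30}))  # -> 0.5
```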

By applying this metric, cities can be divided into three categories: those with high integration capacity; those with little integration, whose immigrant communities are highly segregated from a spatial point of view; and an intermediate category between the two extremes, explains Ramasco. "In the first group (high integration) we find cities such as London, San Francisco, Tokyo and Los Angeles, while at the other extreme (low integration) we see others such as Detroit, Miami, Toronto and Amsterdam," he explains.

Beyond individual cities, the method can also be used to analyse how different cultures, characterised by language, integrate within the countries where these cities are located. The best integration is found among nearby cultures, for example, speakers of Latin-based languages such as Portuguese and Italian in South American Spanish-speaking countries, or people from other European countries within the United Kingdom. The greatest segregation occurs between markedly different cultures.

This method opens a new avenue, offering a fresh source of data for analysing the segregation or spatial integration of immigrants' residences. The online data, although generated for other purposes, are immense and constantly updated. Such studies offer access to near real-time information at a significantly reduced cost, with study areas on a global scale. "We hope, then, that this first work will open up the possibility for future use of this data to study integration. We also hope that it may be a valuable complement beyond the scientific community for managers and public authorities who are in charge of immigration," concludes Ramasco.

Credit: 
Spanish National Research Council (CSIC)

Little evidence for any direct impact of national cancer policies on short-term survival in England

A study published by The BMJ today finds little evidence for any direct impact of national cancer policy initiatives implemented since 2000 on short-term cancer survival in England.

And no evidence was found for a reduction in social and economic inequalities ("deprivation gap") in cancer survival since the mid-1990s. The researchers say these findings "emphasise that socioeconomic inequalities in survival remain a major public health problem for a healthcare system founded on equity."

Cancer survival in England has been improving steadily since the 1970s, but still lags behind that seen in comparable countries in Europe. Differences in cancer survival between less and more deprived patients also persist for most types of cancer.

The NHS Cancer Plan was launched in 2000 in a bid to tackle these inequalities and improve cancer survival to levels comparable with the rest of Europe. Since then, other strategies have been introduced, but little is known about their impact at a national level.

So researchers in the Cancer Survival Group at the London School of Hygiene & Tropical Medicine set out to assess the impact of the NHS Cancer Plan and subsequent strategies on cancer survival in England and whether any gains were evenly distributed across the socio-economic groups of the population.

They analysed national cancer registry data for 3.5 million people aged 15-99 who had been diagnosed with one of the 24 most common cancers in three pre-defined calendar periods: 1996-2000, 2001-05, and 2006-13, with follow-up to 2014. This allowed comparison of trends before and after introduction of the NHS Cancer Plan.

They then estimated one year net survival for each cancer by sex, year of diagnosis, and level of deprivation. They focused on one year survival because most inequalities in cancer survival in England arise shortly after diagnosis.

They found that one year survival improved steadily from 1996 for 26 of the 41 sex-cancer combinations studied. For nine further cancers, improvement in survival was initially negligible and began or accelerated only later, from 2001 or 2006.

The largest improvements (greater than 1% per year) were seen for cancers that were of poor or intermediate prognosis in the 1990s, such as cancers of the oesophagus and liver in men, lung in women, and kidney, mesothelioma, and myeloma.

In contrast, survival for men diagnosed as having cancer of the larynx or testis, or Hodgkin lymphoma, was already high in the 1990s, and this improved little by 2013.

Meanwhile, the deprivation gap remained unchanged for most cancers, with a clear, persistent pattern of lower survival among more deprived patients. While the gap narrowed slightly for some cancers, where one year survival was already more than 65% in 1996, it widened notably for brain tumours in men and for lung cancer in women.

The researchers point to some limitations and say it may still be too early to detect the full impact of recently implemented cancer initiatives. Nevertheless, the study is based on virtually all cancer cases registered in England over 18 years, allowing for more accurate estimates of trends.

"Even though increasing cancer survival and reducing inequalities in survival have been among the main targets of national cancer policy initiatives implemented since 2000, this study found little evidence of a direct impact of these strategies on one year survival, and no evidence for a reduction in socioeconomic inequalities in survival," write the authors.

They say their findings "should be taken into consideration by cancer policy makers and inform future initiatives" and suggest that "shifting the focus from individual factors to healthcare system factors might prove to be beneficial in improving cancer outcomes among the most disadvantaged."

Does this mean the NHS cancer reforms failed? That is the question posed by researchers at the University of Otago, New Zealand, in a linked editorial.

They point out that drivers of social disparities in cancer outcomes are complex, making it difficult to establish cause and effect, and they say genuine improvements "may require a more comprehensive approach than the NHS reforms."

To reduce the incidence and impact of cancer "we must continue to be ambitious," they write. "The goals of the NHS Cancer Plan were to save lives and to ensure that the gains were evenly shared. These goals remain critical. The methods to achieve them need more work," they conclude.

Credit: 
BMJ Group

Quantum mechanics runs hot in a cold plasma: UBC research

image: Particles quench in a disordered web of quantum interactions to form a state of many-body localization.

Image: 
Ed Grant

University of British Columbia researchers have found a new system that could help yield 'warmer' quantum technologies.

Quantum technologies such as quantum computers have the potential to process information much more quickly and powerfully than conventional computers. That prospect has spurred interest in exotic, complex quantum phenomena, particularly a state called many-body localization.

Many-body localization occurs when quantum interactions trap particles in a web-like mesh of random locations. This phase of matter protects the energy stored in quantum states from degrading to heat--an effect that could safeguard information in fragile qubits, which are the building blocks of quantum computation.

Until now, efforts to study many-body localization, both theoretically and experimentally, have focused on quantum systems cooled to temperatures close to absolute zero (-273°C).

"The effect has been assumed to occur only under conditions that are very difficult to engineer," explains UBC chemical physicist Ed Grant. "So far, most evidence for many-body localization has been found using atoms arrayed in space by crossed laser fields. But arrangements like these last only as long as the light is on and are as easily disrupted as ripping a piece of tissue paper."                                                

In the latest issue of Physical Review Letters, Grant and theoretical physicist John Sous describe the results of an experiment in which laser pulses gently lift a large number of molecules in a gas of nitric oxide to form an ultracold plasma.

The plasma, consisting of electrons, ions and Rydberg molecules (NO+ ions orbited by a distant electron), self-assembles and appears to form a robust many-body localized state. The researchers believe the plasma 'quenches' to achieve this state naturally, without needing a web of laser fields - no more ripping apart. 

Just as importantly, the system doesn't have to start at a temperature near absolute zero. The mechanism of self-assembly operates naturally at high temperature, seemingly leading to a spontaneous state of many-body localization.

"This could give us a much easier way to make a quantum material, which is good news for practical applications," says Grant.

Credit: 
University of British Columbia

Saving lives by mitigating food insecurity due to droughts

Last year, 81 million people worldwide experienced severe food insecurity. About 80 percent of them live in Africa.

While much of that food insecurity relates to civil war and violence in places like South Sudan and Nigeria, a good portion also stems from a sequence of five severe droughts that began in Ethiopia in 2015 and spread across parts of the continent in the ensuing three years.

Climatologists at UC Santa Barbara's Climate Hazards Group (CHG) have been studying the relationships between these droughts and exceptionally warm sea surface temperatures in the eastern and western Pacific Ocean. Working with the Famine Early Warning Systems Network (FEWS NET) as well as scientists from the Center for Earth Resources Observation and Science and the National Oceanic and Atmospheric Administration, the multidisciplinary team has been able to deliver skillful predictions of both drought and famine that have helped reduce the effects of food insecurity. Their latest findings appear in the Quarterly Journal of the Royal Meteorological Society.

"This work has been very personal because I was doing the same job in 2011, when more than 258,000 Somalis died during a very similar set of consecutive droughts," explained CHG research director Chris Funk, who also is a scientist with the U.S. Geological Survey's Early Warning and Environmental Monitoring Program. "Since 2011, we've been working hard to better understand the factors leading up to those droughts so that we could provide more effective early warning next time."

And that they did.

In June 2015, the team predicted that southern Africa would experience a drier-than-usual rainy season that would impact both crops and livestock in the area. Monitoring of the early season rainfall performance indicated that rainfall was late in arriving and insufficient when it finally came. Compounding this, limited governmental support and poor seed distribution diminished the opportunity to make the most of limited rainfall.

As predicted, by January 2016 the area was experiencing severe drought and the driest rainy season in 35 years. However, successful preparations helped prevent a far worse crisis. Even as southern Africa struggled to cope with a terrible growing season and devastated water supplies, another series of droughts loomed on the horizon.

"Our analysis suggests that strong El Niños may be followed by warm western Pacific sea surface temperature conditions, which can lead to conditions conducive for successive and potentially predictable east African droughts," Funk said. "Our research identifies regions of exceptionally warm sea surface temperatures that have been used to predict many recent droughts."

Then, in fall 2016, CHG climatologists again predicted a potentially devastating drought in the eastern horn of Africa, which would continue into the spring of 2017, resulting in yet another terrible sequence of back-to-back failed growing seasons across eastern Ethiopia and southern Somalia. In fact, that unprecedented lack of rainfall spread across a much larger region than in 2011.

Thanks to the team's early warning and the successful partnerships of many organizations, an extensive and effective multi-agency response began in early 2017. And despite the 2016-17 drought's severity, few deaths were attributed to it.

"Sea surface temperatures create opportunities for prediction because a really warm ocean often triggers changes in atmospheric circulation that produce droughts in some places and more rainfall in others," Funk explained. "If we pay attention and watch where those exceptionally warm sea surface temperatures are, we then can produce better drought forecasts that help prevent food insecurity in Africa."

FEWS NET's early warning system demonstrates the immense potential of bringing researchers from disparate fields together to solve a common problem. The close partnership among scientists, food security analysts and decision-makers produces new science with the power to save lives.

By developing new satellite information products and climate prediction strategies and techniques, CHG scientists in Africa and Central America build capacity in their regions, empowering poor nations to better cope with climate extremes. The team is working to make the world more food secure by mapping, understanding and anticipating climate extremes.

"The bad news is that it seems like climate change is hurting people by increasing the severity of climate extremes," Funk noted. "The good news is that this type of climate change -- if we understood it correctly -- can help us predict these extremes and associated droughts, so we can be ready to adapt and mitigate their impacts."

Credit: 
University of California - Santa Barbara

'Heat not burn' smokeless tobacco product may not be as harm free as claimed

iQOS, one of the first 'heat not burn' smokeless tobacco products marketed as a safer alternative to conventional cigarettes, may not be as harm free as its manufacturer claims, suggests research published online in the journal Tobacco Control.

iQOS is a battery-operated electronic device, which mimics the looks, taste, and sensory experience of a cigarette. It contains a specially designed heat stick, which uses a tobacco plug to deliver nicotine. This is heated to temperatures well below those at which conventional cigarettes burn, producing a tobacco-infused vapour for inhalation rather than smoke.

Tobacco smoke is what contains the cocktail of chemicals that is so harmful to health.

The manufacturer, Philip Morris International, has evaluated iQOS in several published studies, but there has been little independent research.

To try to plug this gap, the US researchers set out to assess the performance of iQOS under five different puff conditions, and the impact of two cleaning protocols: a thorough clean after use of each heat stick to remove fluid and debris from the heater; and the manufacturer's recommended schedule of cleaning the device after every 20 heat sticks, using the brush cleaners supplied with the product.

The researchers also wanted to gauge if the plastic polymer film filter, which aims to cool the vapour, might pose a risk to health.

Each iQOS heat stick lasts only 6 minutes, after which the device automatically shuts off and must be recharged before further use. So to get the most out of each heat stick, real-life users would have to shorten the interval between puffs, speeding up their puff rate and potentially breathing in larger amounts of vapour, say the researchers.

The tobacco plug charred as a result of pyrolysis--thermal decomposition in the absence of oxygen. Charring was more extensive when thorough cleaning was not carried out after use of each heat stick, suggesting that build-up of debris and fluid increases pyrolytic temperatures, say the researchers.

Analysis of the polymer film showed that irrespective of whether cleaning was done or not, the intensity of the heat was sufficient to melt the film even though it was not in direct contact with the heating element.

Following the manufacturer's recommended cleaning instructions increased both the extent of charring and polymer film melt.

Of further concern was the release of formaldehyde cyanohydrin by the melting filter at temperatures that all users will easily exceed, say the researchers. This chemical is highly toxic even at very low levels.

"iQOS is not strictly a 'heat not burn' tobacco product," write the researchers, who go on to say: "This study has shown that the iQOS system may not be as harm free as claimed, and also emphasises the urgent need for further safety testing as the popularity and user base of this product is growing rapidly."

Credit: 
BMJ Group

Health care reform and EHR design should be built around patients' goals

Meaningful reform of primary care should not only address the provision, documentation and payment of care; it should be based on patients' goals for their lives and health, with corresponding redesign of electronic health records. A report from an international team of primary care researchers recommends that the current problem-oriented, fee-for-documentation structure of EHRs be replaced by a framework built around life and health goals. This focus would not only better serve patients; it would also help refocus medical professionals on the full scope of human health.

To begin the process of creating goal-directed electronic health records, the authors suggest incorporating core patient profile and health planner functions into existing EHRs and creating linkages between patient characteristics and other parts of the EHR. If the patient attributes captured by EHRs are expanded to include actionable sociocultural and socioeconomic information, life and health goals, care preferences, and personal risk factors, they can be leveraged by other EHR components so that patients and clinicians can work together to develop personalized care.

The authors point out that, although numerous systemic and administrative health care innovations have been tried, the problem-oriented approach to care and its conceptual image coded into the medical record remain the same across innovations. If patient life and health goals are to drive health care and medical record design, shifts will also need to occur in health care delivery, measurement, and payment. The authors call for research into how patients and health care teams can partner effectively using goal-directed health records.
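As a rough illustration of the proposed shift, a goal-directed record would put life and health goals, care preferences and risk factors at the core, with links back to conventional EHR components. The sketch below is a hypothetical schema, not the authors' design:

```python
# Hypothetical sketch of a goal-directed patient record, as contrasted with a
# problem list. Field names are illustrative assumptions, not the authors' schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class LifeGoal:
    description: str  # e.g. "stay independent at home"
    relevant_conditions: List[str] = field(default_factory=list)

@dataclass
class GoalDirectedRecord:
    patient_id: str
    life_goals: List[LifeGoal] = field(default_factory=list)
    care_preferences: List[str] = field(default_factory=list)
    personal_risk_factors: List[str] = field(default_factory=list)
    # Linkages back to conventional EHR components (problem list, orders, notes).
    linked_problems: List[str] = field(default_factory=list)

record = GoalDirectedRecord(
    patient_id="demo-001",
    life_goals=[LifeGoal("stay independent at home", ["osteoarthritis", "CHF"])],
    care_preferences=["avoid hospitalization where possible"],
)
```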

Moving From Problem-Oriented to Goal-Directed Health Records
Zsolt J. Nagykaldi, PhD, et al
University of Oklahoma Health Sciences Center, Oklahoma City, Oklahoma
http://www.annfammed.org/content/16/2/155.full

Credit: 
American Academy of Family Physicians

Individual education programs not being used as intended in special education

Gone are the days when students with disabilities were placed in a separate classroom, or even in a completely different part of the school. These students now often sit alongside their general education peers for at least part of the day, with the help of individualized education programs (IEPs).

IEPs are considered the main drivers of special education and the mechanism through which these students receive an education tailored to their individual learning needs. However, there are challenges to implementing them in inclusive settings. A Penn State researcher is examining the role of IEPs for students with specific learning disabilities in general education settings.

In this study, published recently in the journal Educational Evaluation and Policy Analysis, Laura Bray, assistant professor of education, explored how educators wrote, used and conceptualized the role of IEPs for students with specific learning disabilities within inclusive general education settings. "IEPs are supposed to be standards-based and tailored to the student's needs," Bray said. "We wanted to understand how teachers draw from IEPs and utilize them in their teaching."

The researchers found that IEPs are largely aligned to the general education curriculum and not individualized. "We found that students' IEPs were responding to institutional pressures to educate students within these settings," said Bray. "However, the content of the students' IEPs offered limited guidance on providing students with special education supports and services. That being said, the IEPs still played distinctive roles in each school's unique activity system for educating students within inclusive classrooms."

To come to this conclusion, the researchers examined data from a qualitative comparative case study that explored two secondary schools organized for the inclusion of students with disabilities in general education classrooms. "We looked at how IEPs were written to respond to institutional pressures to provide them a general education, how the IEPs responded to their individual education needs, other types of activities the educators would engage in to determine needs, and how IEP activities were being implemented and monitored at the schools," Bray explained.

The researchers chose two high schools in two different school districts, focusing on five students in 10th or 11th grade who had been identified with a specific learning disability and required modifications and accommodations in their classrooms.

Bray found that in one school, IEPs were being used as a sort of "triage" to help the students pass their courses, which is not the intent of IEPs. In the other school, Bray found that IEPs also were largely unused; however, co-teaching was used to help students with disabilities within the inclusive classrooms, and once a day these students attended a special education study hall. "We found that students in the second school had greater access to special education services," said Bray. "In both schools, while the IEPs were not being used as intended, they were still instrumental in shaping the educational supports students received."

Bray said this research is important because it will influence policy, as the Individuals with Disabilities Education Act (IDEA) is currently up for reauthorization. The IDEA ensures that children with disabilities receive free, appropriate public education and ensures special education and related services are provided to those children.

"We believe our findings will invite debate moving forward, especially as there is not much research in this area," Bray said. "This project focused on students with specific learning disabilities being educated within inclusive classrooms. In the future, we will look at other factors, such as evaluating the organization of schools for the inclusion of these students. We will also research how to better develop and implement IEPs for students being educated within inclusive settings."

Credit: 
Penn State

Breeding trouble: Meta-analysis identifies fishy issues with captive stocks

image: Co-author Dr Carolyn Hogg working with released captive-born devils on Maria Island in Australia's state of Tasmania.

Image: 
Phil Wise took the photo on Dr Hogg's camera.

A group of researchers based at the University of Sydney has uncovered patterns that may be jeopardising the long-term success of worldwide animal breeding programs, which increasingly serve as insurance against extinction in conservation and as a foundation for food security.

The meta-analysis, led by the University of Sydney's Faculty of Science, found captive-born animals had, on average, almost half the odds of reproductive success compared to their wild-born counterparts in captivity.
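To see what "almost half the odds" means, consider a worked example; the numbers below are invented purely for illustration and are not the study's data:

```python
# Hypothetical worked example of an odds ratio near 0.5.

def odds(successes, failures):
    return successes / failures

wild_born = odds(50, 50)     # 50 of 100 wild-born animals bred: odds = 1.0
captive_born = odds(33, 67)  # 33 of 100 captive-born animals bred: odds ~ 0.49

print(captive_born / wild_born)  # odds ratio ~ 0.49, i.e. almost half the odds
```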

In aquaculture, the effects were particularly pronounced, although research and conservation programs showed the same trend.

The study analysed more than 100 results from 39 animal studies of 44 diverse species, including shrimp, fish, mice, ducks, lemurs and Tasmanian devils.

The paper, "A meta-analysis of birth-origin effects on reproduction in diverse captive environments", is published today in Nature Communications.

Dr Catherine Grueber, who supervised the study, said the team was surprised at how universal the patterns were.

"More than 2,000 threatened species rely on successful reproduction through captive breeding programs for conservation alone," said Dr Grueber, from the University of Sydney's School of Life and Environmental Sciences and San Diego Zoo Global.

"In order to maintain our food supply, it's crucial we improve captive breeding; for example, the aquaculture industry is looking at introducing new species for commercialisation."

Lead author Ms Kate Farquharson, a PhD student, said the results provide opportunities for improving the long-term success of animal breeding programs.

"Our dataset included measurements of lots of different reproductive traits - such as fertility, number of offspring, and timing of reproduction - but found that certain traits, such as offspring weight and mothering ability, seem to be the most strongly affected," Ms Farquharson said.

"This provides an opportunity for animal breeding programs, by identifying the areas where improvement could boost sustainability."

Research manager at the University of Sydney's Australasian Wildlife Genomics Group, co-author Dr Carolyn Hogg, said the research could be extended by undertaking multi-generational studies.

"Identifying limitations as well as opportunities in captive breeding programs across all industries is an urgent priority," Dr Hogg said.

Credit: 
University of Sydney

Patients do as well on generic antiplatelet drugs as on the more expensive brand-name product

DALLAS, March 13, 2018 - Generic antiplatelet drugs seem to work as well as a brand-name drug for heart patients, according to new research in Circulation: Cardiovascular Quality and Outcomes, an American Heart Association journal.

When a Canadian health system switched from prescribing the brand-name anti-platelet drug Plavix® to a far-cheaper generic version, heart-attack and chest pain patients were no more likely to die from any cause or be re-hospitalized for a heart attack or unstable angina within a year than those prescribed Plavix® (17.9 percent vs. 17.6 percent).

In addition, there were no significant differences between the drug groups in the percent of patients who died or were hospitalized for any reason; had a stroke or transient ischemic attack; or who developed bleeding as a side effect of treatment.

"People can safely use generic clopidogrel. This large and real-world study should be reassuring to physicians and healthcare organizations who have been concerned about changing what is prescribed," said Dennis T. Ko, M.D., M.Sc., lead study author and senior scientist at the Institute for Clinical Evaluative Sciences (ICES) in Toronto.

In Canada and the United States, generic drugs are approved based on small studies in healthy people showing that the active ingredient is released at equivalent levels and over the same timeframe. That suggests, but doesn't prove, that the generic product will also have the same safety and medical benefit.

Clopidogrel is used to treat patients with acute coronary syndrome, stroke or peripheral vascular disease, and patients undergoing percutaneous coronary intervention.

Researchers compared outcomes in patients (average age 77, 57 percent male) who were prescribed clopidogrel after hospitalization for a heart attack or heart-related chest pain (unstable angina) in Ontario, Canada, where the Ministry of Health began to automatically substitute generic clopidogrel for Plavix® once the brand name drug's patent expired in 2012. Between 2009 and 2014, 12,643 patients were prescribed Plavix® and 11,887 generic clopidogrel.

Plavix® cost about $2.58 Canadian dollars per pill in 2010 and was projected to cost the Ontario Drug Benefit Program $72.8 million by 2012. But thanks to the switch to a generic, which costs $0.39 per pill in 2018, the expense was only $19 million Canadian dollars.

"Plavix® was one of the most commonly used drugs in cardiology, so switching to generics can reduce a lot of cost for individuals and health systems," said Ko, who is also a cardiologist at the Schulich Heart Centre of the Sunnybrook Health Sciences Centre at the University of Toronto.

While the study was conducted in Canada, the results should apply to the United States, even if the generic drug offerings are slightly different, according to the researchers.

"There are quite a few different generic brands. In this study, we considered them as a group, but later found no differences in outcome when we compared between different generics," said co-principal investigator Cynthia Jackevicius, Pharm.D., M.Sc., professor of pharmacy at Western University of Health Sciences and senior adjunct scientist at ICES.

The American Heart Association recommends clopidogrel - sometimes in combination with other drugs - for patients who have had acute coronary syndrome (unstable angina or heart attack) or stroke.

Credit: 
American Heart Association

Lack of water is key stressor for urban trees

image: Insufficient water not only harms trees, but allows other problems to have an outsized effect on trees in urban environments.

Image: 
Emily Meineke

A study out March 13 finds that urban trees can survive increased heat and insect pests fairly well - unless they are thirsty. Insufficient water not only harms trees, but allows other problems to have an outsized effect on trees in urban environments.

"We would see some vibrant urban trees covered in scale insects, but we'd also see other clearly stressed and struggling urban trees covered in scale insects," says Emily Meineke, a postdoctoral researcher at Harvard and first author of a paper on the study. "We wanted to know what allowed some trees to deal with these pests so much more successfully."

"This is important because trees need to grow in order to perform valuable ecosystem services, such as removing pollutants from the air and storing carbon," says Steve Frank, an associate professor of entomology at North Carolina State University and co-author of the paper.

It's extremely difficult to design a field study that addresses these questions about the role of various environmental variables, given all of the uncontrolled factors in an urban environment. So the researchers used both field data and controlled laboratory experiments.

The researchers collected detailed data on 40 urban willow oaks (Quercus phellos) over the course of two years. The data included temperature, how water-stressed the trees were, and the density of scale insects. Scale insects (Parthenolecanium species) are well-known tree pests.

But the researchers also conducted laboratory experiments using willow oak saplings. In these experiments, the researchers manipulated three variables while growing the willow oaks: temperature, water and the presence of scale insects.

The researchers found that higher temperatures could actually have a positive effect on tree growth, as long as the trees had adequate water. And scale insects had little or no adverse effect on the trees if the trees were not water stressed.

The researchers also found that water stress limited tree growth all by itself. But the presence of increased heat and/or scale insects, when combined with water stress, had a multiplier effect - curtailing growth far more than water stress or scale insects alone.

"This tells us that management strategies aimed at increasing tree hydration in cities may reduce the adverse effects of all three of these key stressors," says Meineke, a former Ph.D. student in Frank's lab. "And that is likely to become increasingly important as water availability, temperature and pest abundance are affected by further urbanization and climate change."

"For example, urban planners could design urban landscapes that retain stormwater in vegetation; invest in hydration strategies, such as appropriate soil quality and soil volume; and plant drought-tolerant tree species and genotypes in the hottest parts of their cities," Frank says.

"Moving forward, we're very curious about the prevalence of water stress in urban trees globally - and whether this leads to similar problems regarding the impact of tree pests," Meineke says. "If so, improved tree hydration could become a higher priority for urban forestry management."

Credit: 
North Carolina State University

Engineers create most efficient red light-activated optogenetic switch for mammalian cells

image: Postdoctoral researcher and first author Phillip Kyriakakis demonstrates the desktop system powered by the new optogenetic switch researchers developed.

Image: 
University of California San Diego

A team of researchers has developed a light-activated switch that can turn genes on and off in mammalian cells. This is the most efficient so-called "optogenetic switch" activated by red and far-red light that has been successfully designed and tested in animal cells--and it doesn't require the addition of sensing molecules from outside the cells.

The light-activated genetic switch could be used to turn genes on and off in gene therapies; to turn off gene expression in future cancer therapies; and to help track and understand gene function in specific locations in the human body.

The team, led by bioengineers at the University of California San Diego, recently detailed their findings online in ACS Synthetic Biology.

"Being able to control genes deep in the body in a specific location and at a specific time, without adding external elements, is a goal our community has long sought," said Todd Coleman, a professor of bioengineering at the Jacobs School of Engineering at UC San Diego and one of the paper's corresponding authors. "We are controlling genes with the most desirable wavelengths of light."

The researchers' success in building the switch relied on two insights. First, animal cells don't have the machinery to supply electrons to make molecules that would be sensitive to red light. It's the equivalent of having a hair dryer and a power outlet from a foreign country, but no power cord and no power outlet adapter. So researchers led by UC San Diego postdoctoral researcher Phillip Kyriakakis went about building those.

For the power cord, they used bacterial and plant ferredoxin, an iron and sulfur protein that brings about electron transfer in a number of reactions. Ferredoxin exists in a different form in animal cells, one that isn't compatible with its plant and bacterial cousins. So an enzyme called Ferredoxin-NADP reductase, or FNR, played the role of outlet adapter.

As a result, the animal cells could now transfer enough electrons from their energy supply to other enzymes that can produce the light-sensitive molecules needed for the light-activated switch.

The second insight was that the system to make light-sensitive molecules needed to be placed in the cell's mitochondria, the cell's energy factory. Combining these two insights, the researchers were able to build a plant system to control genes with red light inside animal cells.

Red light is a safe option to activate genetic switches because it easily passes through the human body. A simple way to demonstrate this is to put your hand over your smart phone's flashlight while it's on. Red light, but not the other colors, will shine through because the body doesn't absorb it. And because it's not absorbed, it can actually pass through tissues harmlessly and reach deep within the body to control genes.

Bioengineers built and programmed a small, compact tabletop device to activate the switch with red and far-red light. The tool allows researchers to control the duration that the light shines, down to the millisecond. It also allows them to target very specific locations. Researchers showed that the genes turned on by the switch remained active for several hours in several mammalian cell lines even after a short light pulse.

The team recently received an internal campus grant to use the method to control gene activation in specific regions of the brain. This would allow them to better understand gene function in a variety of neurological disorders.

The researchers patented the use of ferredoxins and FNR to target the enzymes needed to make light-activated molecules. The technology is available for licensing.

Importantly, insights into how to produce plant molecules in animal cells could also one day enable the production of other molecules, potentially leading to crops that do not need fertilizer and to more efficient biofuel production.

Credit: 
University of California - San Diego

Ultra-white coating modelled on beetle scales

image: This is a Cyphochilus beetle.

Image: 
Olimpia Onelli

Researchers have developed a super-thin, non-toxic, lightweight, edible ultra-white coating that could be used to make brighter paints and coatings, for use in the cosmetic, food or pharmaceutical industries.

The material - which is 20 times whiter than paper - is made from non-toxic cellulose and achieves such bright whiteness by mimicking the structure of the ultra-thin scales of certain types of beetle. The results are reported in the journal Advanced Materials.

Bright colours are usually produced using pigments, which absorb certain wavelengths of light and reflect others; our eyes then perceive the reflected wavelengths as colour.

To appear as white, however, all wavelengths of light need to be reflected with the same efficiency. Most commercially-available white products - such as sun creams, cosmetics and paints - incorporate highly refractive particles (usually titanium dioxide or zinc oxide) to reflect light efficiently. These materials, while considered safe, are not fully sustainable or biocompatible.

In nature, the Cyphochilus beetle, which is native to Southeast Asia, produces its ultra-white colouring not through pigments, but by exploiting the geometry of a dense network of chitin - a molecule which is also found in the shells of molluscs, the exoskeletons of insects and the cell walls of fungi. Chitin has a structure which scatters light extremely efficiently - resulting in ultra-white coatings which are very thin and light.

"White is a very special type of structural colour," said paper co-author Dr Olimpia Onelli, from Cambridge's Department of Chemistry. "Other types of structural colour - for example butterfly wings or opals - have a specific pattern in their structure which results in vibrant colour, but to produce white, the structure needs to be as random as possible."

The Cambridge team, working with researchers from Aalto University in Finland, mimicked the structure of chitin using cellulose, which is non-toxic, abundant, strong and bio-compatible. Using tiny strands of cellulose, or cellulose nanofibrils, they were able to achieve the same ultra-white effect in a flexible membrane.

By using a combination of nanofibrils of varying diameters, the researchers were able to tune the opacity, and therefore the whiteness, of the end material. The membranes made from the thinnest fibres were more transparent, while adding medium and thick fibres resulted in a more opaque membrane. In this way, the researchers were able to fine-tune the geometry of the nanofibrils so that they reflected the most light.

"These cellulose-based materials have a structure that's almost like spaghetti, which is how they are able to scatter light so well," said senior author Dr Silvia Vignolini, also from Cambridge's Department of Chemistry. "We need to get the mix just right: we don't want it to be too uniform, and we don't want it to collapse."

Like the beetle scales, the cellulose membranes are extremely thin: just a few millionths of a metre thick, although the researchers say that even thinner membranes could be produced by further optimising their fabrication process. The membranes scatter light 20 to 30 times more efficiently than paper, and could be used to produce the next generation of efficient, bright, sustainable and biocompatible white materials.

Credit: 
University of Cambridge

Off-the-shelf smart devices found easy to hack

BEER-SHEVA, Israel...March 13, 2018 - Off-the-shelf devices that include baby monitors, home security cameras, doorbells, and thermostats were easily co-opted by cyber researchers at Ben-Gurion University of the Negev (BGU). As part of their ongoing research into detecting vulnerabilities of devices and networks expanding in the smart home and Internet of Things (IoT), the researchers disassembled and reverse engineered many common devices and quickly uncovered serious security issues.

"It is truly frightening how easily a criminal, voyeur or pedophile can take over these devices," says Dr. Yossi Oren, a senior lecturer in BGU's Department of Software and Information Systems Engineering and head of the Implementation Security and Side-Channel Attacks Lab at Cyber@BGU. "Using these devices in our lab, we were able to play loud music through a baby monitor, turn off a thermostat and turn on a camera remotely, much to the concern of our researchers who themselves use these products."

"It only took 30 minutes to find passwords for most of the devices and some of them were found only through a Google search of the brand," says Omer Shwartz, a Ph.D. student and member of Dr. Oren's lab. "Once hackers can access an IoT device, like a camera, they can create an entire network of these camera models controlled remotely."

The BGU researchers discovered several ways hackers can take advantage of poorly secured devices. They found that similar products under different brands share the same common default passwords. Because consumers and businesses rarely change device passwords after purchase, devices can operate infected with malicious code for years.

They were also able to log on to entire Wi-Fi networks simply by retrieving the password stored in a device.

Dr. Oren urges manufacturers to stop using easy, hard-coded passwords, to disable remote access capabilities, and to make it harder to get information from shared ports, like an audio jack which was proven vulnerable in other studies by Cyber@BGU researchers. "It seems getting IoT products to market at an attractive price is often more important than securing them properly," he says.

Tips for IoT Product Security

With the goal of making consumers smarter about smart home device protection, BGU researchers offer a number of tips to keep IoT devices, families and businesses more secure:

1. Buy IoT devices only from reputable manufacturers and vendors.

2. Avoid used IoT devices. They could already have malware installed.

3. Research each device online to determine if it has a default password and, if so, change it before installing.

4. Use strong passwords with a minimum of 16 letters. These are hard to crack (see the rough arithmetic after this list).

5. Multiple devices shouldn't share the same passwords.

6. Update device software regularly, using updates obtained only from reputable manufacturers.

7. Carefully consider the benefits and risks of connecting a device to the internet.
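The rough arithmetic behind tip 4 shows why 16-letter passwords resist guessing. A minimal sketch; the attacker speed is an assumed figure, purely for illustration:

```python
# Why 16 letters are hard to crack: size of the search space.
# Lowercase letters only; adding cases, digits and symbols grows it further.
space = 26 ** 16
print(f"{space:.1e} combinations")  # ~4.4e+22

# Assumed attacker speed (illustrative): one trillion guesses per second.
guesses_per_second = 1e12
years = space / guesses_per_second / (3600 * 24 * 365)
print(f"~{years:.0f} years to exhaust the space")  # on the order of a thousand years
```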

"The increase in IoT technology popularity holds many benefits, but this surge of new, innovative and cheap devices reveals complex security and privacy challenges," says Yael Mathov, who also participated in the research. "We hope our findings will hold manufacturers more accountable and help alert both manufacturers and consumers to the dangers inherent in the widespread use of unsecured IoT devices."

Credit: 
American Associates, Ben-Gurion University of the Negev

Researchers computationally find the needle in a haystack to treat rare diseases

image: The chemotherapeutic vandetanib bound to its main target, Protein Tyrosine Kinase 6 (PTK6), shown in purple, which is involved in many cancers including gastrointestinal tumors and ovarian cancers. By modeling the vandetanib-PTK6 complex, researchers at LSU found that the KRAS protein also contains a similar drug-binding site and is therefore a good match for the same drug. The computer-generated model of KRAS, in gold, with vandetanib depicts the predicted interaction.

Image: 
Misagh Naderi, LSU

One in 10 people in America is fighting a rare disease, or a disorder that affects fewer than 200,000 Americans. Although there are more than 7,000 rare diseases that collectively affect more than 350 million people worldwide, it is not profitable for the pharmaceutical industry to develop new therapies to treat the small number of people suffering from each rare condition. Researchers at the LSU Computational Systems Biology group have developed a sophisticated and systematic way to identify existing drugs that can be repositioned to treat a rare disease or condition. They have fine-tuned a computer-assisted drug repositioning process that can save time and money in helping these patients receive effective treatment.

"Rare diseases sometimes affect such a small population that discovering treatments would not be financially feasible unless through humanitarian and governmental incentives. These conditions that are sometimes left untreated are labeled 'orphan diseases.' We developed a way to computationally find matches between rare disease protein structures and functions and existing drug interactions that can help treat patients with some of these orphan diseases," said Misagh Naderi, one of the paper's lead authors and a doctoral candidate in the LSU Department of Biological Sciences.

This research will be published this week in npj Systems Biology and Applications, a journal published by the Nature Publishing Group in partnership with the Systems Biology Institute.

"In the past, most repurposed drugs were discovered serendipitously. For example, the drug amantadine was first introduced to treat respiratory infections. However, a few years later, a patient with Parkinson's disease experienced a dramatic improvement of her disease symptoms while taking the drug to treat the flu. This observation sparked additional research. Now, amantadine is approved by the Food Drug Administration as both an antiviral and an antiparkinsonian drug. But, we can not only rely on chance to find a treatment for an orphan disease," said Dr. Michal Brylinski, the head of the Computational Systems Biology group at LSU.

To systematize drug repurposing, Naderi, co-author Rajiv Gandhi Govindaraj and colleagues combined eMatchSite, software developed by the same group, with virtual screening to match FDA-approved drugs to proteins involved in rare diseases. LSU supercomputers allow them to test millions of possibilities that would cost billions of dollars to test in the lab.
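Conceptually, the matching step ranks approved drugs by how well their known binding sites resemble a pocket on a rare-disease protein. The toy similarity score below is a stand-in for illustration; it is not eMatchSite's actual algorithm, and the site fingerprints are hypothetical:

```python
# Conceptual sketch of computational drug repositioning: score how well each
# approved drug's known binding site matches a pocket on a rare-disease protein.

def pocket_similarity(site_a, site_b):
    """Toy similarity: fraction of shared residue types in two binding sites."""
    a, b = set(site_a), set(site_b)
    return len(a & b) / len(a | b)

approved_drug_sites = {                       # hypothetical site fingerprints
    "drug_X": ["LYS", "ASP", "PHE", "SER"],
    "drug_Y": ["GLY", "HIS", "TRP"],
}
rare_disease_pocket = ["LYS", "ASP", "PHE", "THR"]

ranked = sorted(approved_drug_sites.items(),
                key=lambda kv: pocket_similarity(kv[1], rare_disease_pocket),
                reverse=True)
for drug, site in ranked:
    print(drug, round(pocket_similarity(site, rare_disease_pocket), 2))
```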

Credit: 
Louisiana State University

Algorithm could streamline harvesting of hand-picked crops

image: Although there has been other research on precision agriculture in recent years, this study specifically addresses crops that are currently picked by hand.

Image: 
University of Illinois Department of Industrial and Enterprise Systems Engineering

Farmers are the latest beneficiaries in a world of data analytics. Over the past few years, precision agriculture has been helping farmers make smarter decisions and produce bigger yields. But most studies to date have focused on row crops harvested by large machines, drawing on data collected by drones and other means. However, Richard Sowers, a professor of industrial and enterprise systems engineering and mathematics at the University of Illinois at Urbana-Champaign, and a team of students have developed an algorithm that promises to give valuable information to farmers of crops picked by hand.

Sowers, along with students Nitin Srivastava and Peter Maneykowski, has developed an algorithm to help streamline the workforce for highly perishable hand-picked crops. Their paper, "Algorithmic Geolocation of Harvest in Hand-Picked Agriculture", which will appear in Natural Resource Modeling, presents the results of a study conducted during the strawberry harvest at Crisalida Farms in Oxnard, Calif. Less than a year ago, Sowers co-authored a paper titled "Hand-picked specialty crops 'ripe' for precision agriculture techniques", addressing the timing and transport of such crops.

"The strawberries that you put on your ice cream or cereal are for the moment picked by a crew of 10 or so workers, who mostly earn a wage per box collected," Sowers noted. "For the consumer, it important that the strawberries are of good quality and look nice."

According to Sowers, the strawberries that appear in the clamshells you find at the market or at your local grocery store are largely in the same condition as when they were picked from the field. They are loaded into a box, then a bigger box, then onto a pallet and finally onto a truck. The process is then reversed at the market.

"One of the aspects that I'm interested in is the fact that there are humans involved in picking," Sowers said. "Just like Internet browsing history differs from person to person, along similar lines, a workers' ability to harvest strawberries is different. This brings up the question: how do you think about data in that industry? Because the human variability has a huge effect.

"Figuring out what is going on in the field is an important question," he added. "Identifying that certain parts of the field are producing a higher or lower quality harvest can be valuable in harvest strategy."

Rather than requiring workers to enter data during the harvest, which would slow down the process, Sowers' team pinpointed the exact movement of each worker through GPS tracking on a smartphone each carried. Based on that data, the team developed an algorithm to predict the number of completed boxes.
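A minimal sketch of the idea: convert each worker's GPS trace into per-section dwell times, then scale dwell time into an estimated box count. The calibration constant and data layout below are assumptions for illustration, not the team's algorithm:

```python
# Illustrative sketch: infer picking activity from how long each worker dwells
# in a field section, then scale dwell time into an estimated box count.
from collections import defaultdict

def estimate_boxes(gps_trace, boxes_per_minute=0.25):
    """gps_trace: list of (timestamp_minutes, section_id) samples for one worker.
    boxes_per_minute is a made-up calibration constant."""
    dwell = defaultdict(float)
    for (t0, sec), (t1, _) in zip(gps_trace, gps_trace[1:]):
        dwell[sec] += t1 - t0  # minutes attributed to the section where the interval began
    return {sec: minutes * boxes_per_minute for sec, minutes in dwell.items()}

trace = [(0, "row-3"), (10, "row-3"), (22, "row-4"), (30, "row-4")]
print(estimate_boxes(trace))  # -> {'row-3': 5.5, 'row-4': 2.0}
```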

The data promises to ultimately lead to more precise harvesting techniques. For instance, one round of quality control typically occurs at the edge of the field, and oftentimes there is a backlog of workers waiting in the queue. More data will help planners choose the best times to provide this control, as well as schedule forklifts to pick up pallets and put them in a cooler. Time is of the essence, as hot weather can have a dramatic effect on the quality of the produce.

"At the moment, we're just trying to track," Sowers noted. "You can't manage what you can't measure. We're trying to measure what is going on in the field actually in the field, not at the edge of the field where data is currently being collected. If you know moment by moment how much is being harvested, you can better schedule, rearrange harvest crews or re-task."

Sowers further emphasizes the importance of this measurement to the industry, because a miscalculation of the workforce could completely eliminate profit.

"If that happens, all the nutrients that went into it (water, fertilizers, nitrogen, etc.) is just wasted," he said. "If you can better allocate resources and prevent or lessen the time that some of those stacks of berries are sitting in the field, that's a win."

The team successfully demonstrated that these behaviors can be tracked and analyzed, and is planning to return to California to refine the approach.

"There is a more and more appreciation for data in this industry," Sowers said. "I'd like to go back and do this on a larger scale so that we can try to compare this to something which is at a production grade. In order to actually have an impact, we need to understand and process the data at a level of certainty which is as good as or comparable to what is needed to actually make some decisions for re-allocating people and for optimizing the layout of fields."

Credit: 
University of Illinois Grainger College of Engineering