First measurement of electron energy distributions could enable sustainable energy technologies

To answer a question crucial to technologies such as energy conversion, a team of researchers at the University of Michigan, Purdue University and the University of Liverpool in the U.K. has figured out a way to measure how many "hot charge carriers"--for example, electrons with extra energy--are present in a metal nanostructure.

"For example, if you wanted to employ light to split water into hydrogen and oxygen, you can use hot charge carriers because electrons that are more energetic can more readily participate in the reaction and drive the reaction faster. That's one possible use for hot carriers in energy conversion or storage applications," said Edgar Meyhofer, a professor of mechanical engineering at U-M, who co-led the research.

Vladimir Shalaev, a professor of electrical and computer engineering, led the contribution from Purdue. The findings also confirm that thinner metals are more efficient at using light for generating hot charge carriers. Light can drive the motion of electrons on the surfaces of materials such as gold and silver, creating waves known as surface plasmons. These waves, in turn, can generate hot charge carriers.

The researchers compared the usual distribution of charge carrier energies to air at room temperature: The molecules in air do not all have the same energy--their average energy is reflected by the temperature. The energies of negatively charged electrons and positively charged holes ordinarily follow similar distributions within a material. But in materials that support surface plasmons, light can be used to give extra energy to some charge carriers as though the material were much hotter--more than 2,000 degrees Fahrenheit.

The team created the hot charge carriers by shining laser light onto a gold film just 13 nanometers thick--a hundred or so gold atoms--with tiny ridges spaced so that they would resonate with the laser light and generate the surface plasmon waves. Then they measured the energies of the charge carriers by drawing them through gatekeeper molecules into a gold electrode--the tip of a scanning tunneling microscope.

The key to the experiment is those gatekeeper molecules, which were synthesized by the Liverpool team as well as a private company. The molecules allow only charge carriers with certain energies to pass. By repeating the experiments with different molecules, the researchers figured out the energy distribution of the charge carriers.

"Electrons can be thought of as cars running at different speeds on a highway. The molecule acts like an operator--it only allows cars travelling at a certain speed to pass through," said Kun Wang, a postdoctoral fellow in Meyhofer's group.

The researchers also compare the molecules to a prism--one that separates the spectrum of electron energies rather than the colors in light.

Wang spent more than 18 months working with Harsha Reddy, a Ph.D. student in electrical and computer engineering at Purdue, on how to make this idea work.

"This idea of molecular filters was something no one else in the field has realized in the past," said Reddy, who works in Shalaev's lab.

Once they had developed a successful method, Wang and Reddy repeated the experiments with a second gold structure, this one about 6 nanometers thick. This structure generated hot charge carriers more efficiently than the 13 nanometer version.

"This multidisciplinary basic research effort sheds light on a unique way to measure the energy of charge carriers. These results are expected to play a crucial role in developing future applications in energy conversion and photocatalysis and photodetectors, for instance, that are of great interest to the Department of Defense," said Chakrapani Varanasi, program manager of the team's Multidisciplinary University Research Initiative funded by the Army Research Office.

With the method now demonstrated, the team believes that others can use it to explore and optimize nanostructures. This is important in applications such as converting sunlight to chemical energy because the number of hot charge carriers affects how well a catalyst can direct light energy toward a chemical reaction.

Credit: 
University of Michigan

Air conditioner bumps the electric bill by 42%, increasing the risk of energy poverty

image: A new study by Ca' Foscari and CMCC combines OECD and NASA datasets for 8 countries to show that the share of households' spending dedicated to cooling is greater than estimated in previous studies.

Image: 
Gaia Squarci

A new study published in Economic Modelling by researchers at Ca' Foscari University and CMCC shows that owning and using an air conditioner greatly increases household electricity bills, with important implications for the energy poverty of the less well-off.

Previous studies, mainly focused on the US, estimated an increase in household electricity spending of about 11%. This new study, analysing the socioeconomic characteristics of households in eight other OECD countries (Australia, Canada, France, Japan, the Netherlands, Spain, Sweden, and Switzerland) together with climate data from a NASA dataset, finds that using an AC increases household electricity spending by about 42% on average, compared with households that do not have an AC unit in their home.

The actual increase will depend on the intensity of the climate change households face in the future. This additional spending is thus a new factor in the energy poverty of the poorest households, a situation that arises when a family spends more than 5% of its annual income on electricity.
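
The 5% threshold and the ~42% estimate above combine into simple arithmetic; a minimal sketch (all household figures below are made-up examples, not data from the study):

```python
# Hypothetical illustration of the 5% energy-poverty threshold: an AC unit
# that raises electricity spending by ~42% can push a household over the line.
# The income and bill figures are invented for the example.

ENERGY_POVERTY_SHARE = 0.05   # >5% of annual income spent on electricity
AC_SPENDING_FACTOR = 1.42     # ~42% higher electricity spending with AC

def is_energy_poor(annual_income, electricity_spending):
    """Return True if electricity spending exceeds 5% of annual income."""
    return electricity_spending / annual_income > ENERGY_POVERTY_SHARE

income = 20_000.0          # example annual income
bill_without_ac = 800.0    # example annual electricity bill, no AC
bill_with_ac = bill_without_ac * AC_SPENDING_FACTOR  # 1136.0

print(is_energy_poor(income, bill_without_ac))  # False (4.0% of income)
print(is_energy_poor(income, bill_with_ac))     # True (~5.7% of income)
```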

According to BPIE, in 2014 the population already affected by fuel poverty in Europe ranged from 10% to 15%, depending on the member state. This new study shows a more worrying situation.

"The concept of energy poverty is usually related to ensuring adequate heating during the coldest months - explains Enrica De Cian, professor of Environmental Economics at Ca' Foscari and leader of the Energya team which drafted the study. - Our data, however, indicate that we should widen the concept to include the increasing role of cooling during the summer months. Poorest households already spend a consistent share of their budget for basic goods such as food and electricity. The latest will have to increase to ensure adequate protection of our health especially among the most vulnerable members of households during heatwaves."

Owning an AC already has important implications for household energy expenditures, up to the scale of countries and beyond, with great variation across countries: air conditioning represents about 11% of total final energy use in buildings in the US, but only 1.2% in Europe.

"Our analysis reveals that in Spain 18.5% of households already spend more than 5% of their annual budget in electricity - confirms the Venitian professor. Those percentages are generally higher in coldest countries, reaching for instance 24.2% in Sweden. In France and Switzerland, we find lower numbers, respectively 8% and 5%.

Who uses ACs and why

"The innovative element of this work - adds Teresa Randazzo, first author of the study - is that we take into account drivers of AC adoption and use in households that are difficult to observe and measure, such as the personal perception of thermal comfort, the risk aversion, or the environmental awareness."

The study disentangles the various characteristics of individuals and households to identify those that lead - or not - to wider AC adoption. For instance, a larger share of younger members leads to wider adoption of AC, while more educated individuals tend to use these appliances less, suggesting they are more aware of the environmental impact of energy use.

Similarly, households that are more accustomed to adopting energy-saving behaviors are less likely to adopt AC. Conversely, those with a high number of appliances tend to have a higher propensity for AC - which may be an indication that those used to higher standards of comfort are also more inclined to adopt AC.

"Living in an urban area increases the probability of having AC by 9 percentage points, a sizable effect compared to the role of income and climate, probably due to the heat island effect in cities" adds Malcolm Mistry, responsible for the climate data analyses for the Energya project and co-author of this study.

Data analysis of households and climate

To understand the dynamics of AC adoption in industrialized countries and its impact on household budgets in the light of climate change, the Energya team examined eight OECD countries spanning the mid-latitudes: Australia, Canada, France, Japan, the Netherlands, Spain, Sweden, and Switzerland.

To do so, the researchers combined information on 3,615 geocoded households from a dataset released by the OECD with a historical climate dataset based on NASA-GLDAS data. "Our processing of this climate dataset includes Cooling Degree-Days (CDDs) for the last 49 years, an indicator commonly used in the literature to capture the typical intensity and duration of warm days, and the corresponding cooling requirements," explains Malcolm Mistry.
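
The Cooling Degree-Day indicator mentioned above is straightforward to compute; a minimal sketch, assuming an 18 C base temperature (a common convention, not necessarily the one used in this study):

```python
# Sketch of the Cooling Degree-Day (CDD) indicator: for each day, take how
# far the mean temperature exceeds a comfort base temperature, then sum over
# the period. The 18 C base is an assumed convention for this illustration.

BASE_TEMP_C = 18.0  # assumed comfort threshold

def cooling_degree_days(daily_mean_temps_c):
    """Sum of positive exceedances of the base temperature, in degree-days."""
    return sum(max(t - BASE_TEMP_C, 0.0) for t in daily_mean_temps_c)

# One example week of daily mean temperatures (made-up values):
week = [16.0, 19.5, 22.0, 25.0, 27.5, 21.0, 17.0]
print(cooling_degree_days(week))  # 1.5 + 4.0 + 7.0 + 9.5 + 3.0 = 25.0
```

Days below the base contribute nothing, so the indicator captures both how hot and how often it gets hot, i.e. the cooling requirement over the period.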

AC global trends

From 1990 to 2016, global annual sales of air conditioners more than tripled to reach 135 million units worldwide, with figures from the residential sector alone underscoring the trend. China leads, with 41 million residential units registered, followed by 16 million in the US, and roughly 9 million in both Japan and Europe. "Penetration of air conditioning in households is expected to continue to increase sharply, because of climate change and rising standards of living, reaching 21% in Spain and 35% in France 20 years from now," concludes prof. De Cian.

Credit: 
Università Ca' Foscari Venezia

Acute kidney injury and end stage kidney disease in severe COVID-19

Germany was not hit as hard by the SARS-CoV-2 pandemic as many other European countries: the wave of infection reached Germany later, and the authorities, warned by the situation in Italy and Spain, ordered a lockdown at an early stage and conducted extensive testing. Up to June 2, 2020, there were 182,028 cases of COVID-19 and 8,522 deaths. As in other countries, dialysis patients were at high risk, because they are often older, have more comorbidities and, of course, have an impaired immune system. A practical problem that adds to the risk is that they cannot stop their treatment and must go to a dialysis unit three times a week - strict home isolation is therefore not possible for these at-risk patients.

A registry has been created in Germany to investigate the prevalence and outcome of SARS-CoV-2 infected dialysis patients. By the end of May, about 2% of the registered dialysis patients (about 300 people out of 14,000) had tested positive for SARS-CoV-2, and, as it turned out, these patients had a poor prognosis: The mortality rate was around 20%. Phase 2 of the Registry will also include patients with acute kidney injury and chronic kidney disease, and will investigate outcomes and prognostic factors.

However, not only are patients with kidney disease at higher risk of becoming infected with the coronavirus and of a more severe course of COVID-19 - data also suggest that the kidneys might be a target organ of this viral disease.

Early data from China on COVID-19 included some startling revelations: Kidney involvement seems to be frequent in people who have tested positive and developed symptoms. A consecutive cohort study [1] of COVID-19 patients admitted to a tertiary teaching hospital with 3 branches following a major outbreak in Wuhan in 2020 analysed hematuria, proteinuria, serum creatinine concentration and other clinical parameters, as well as the incidence rate of acute kidney injury (AKI). On admission, 44% of the patients had proteinuria and 26.7% had hematuria. AKI occurred in 5.1% of patients. After adjustment for confounders, all kidney impairment indicators were associated with a higher risk of in-hospital death. As early as February, the authors recommended that clinicians increase their awareness of kidney impairment in hospitalized COVID-19 patients.

Indeed, one other study [2] showed that the incidence of AKI is significantly increased among hospitalized patients: Of 4259 patients not requiring mechanical ventilation, 925 had AKI (any stage) and nine needed kidney replacement therapy. The rate was significantly higher among ventilated patients - of 1190 patients, 276 (23.2%) needed dialysis treatment. The main conclusion drawn by the authors is that "AKI occurs frequently among patients with COVID-19. It occurs early and in temporal association with respiratory failure and is associated with a poor prognosis."

These data strongly suggest that COVID-19 causes kidney injury. In an autopsy study conducted in Hamburg [3], samples from different organ tissues of 27 autopsied COVID-19 patients were analyzed for viral load. It was found that, although the lungs are worst affected by the novel virus, other organs and especially the kidneys are also affected. The samples from seven patients were also used to investigate which renal compartments are particularly affected, and it was shown that the renal tubules and especially the glomerular cells had a high viral load. "These findings are consistent with clinical observations. The glomeruli perform the filtration function of the kidneys and the tubules are responsible for reabsorption. It has been found that, early in the course of COVID-19, many patients had abnormalities in their urine, in particular proteinuria," explained Dr. Hoxha at the press conference to launch the ERA-EDTA Congress. "The question is how these findings can be used."

A study group from Göttingen [4], which cooperates closely with groups in Hamburg, Cologne and Aachen, is currently investigating whether early signs of kidney involvement, such as proteinuria, hypoproteinemia and antithrombin III deficiency, allow early risk assessment and stratification of patients. Such patients would be at higher risk of developing complications such as lung oedema and thromboembolisms, including the dreaded pulmonary embolisms. Both could then be treated prophylactically in patients at risk. A recently launched study [5] is now being conducted to investigate the prognostic significance of kidney parameters.

Credit: 
ERA – European Renal Association

A newly discovered disease may lead to better treatment of cystic fibrosis

image: Comparison between the normal TMEM16A protein and the mutated variant showing the truncating effect that leads to the loss of vast portions of the protein. This leads to severe structural alterations.

Image: 
J. Park et al. 2020/ <i>Journal of Medical Genetics</i>

Cystic fibrosis is the most frequent severe inherited disorder worldwide. Every year, hundreds of families are confronted with this diagnosis - and to date, there is no cure for this disease that mainly affects the respiratory system. Besides supportive treatments, a lung transplant is often the only option to save a patient's life. Researchers at the Universities of Münster and Regensburg have now discovered a novel disease that might lead to a better understanding of cystic fibrosis and new treatment options in the future. The results have been published in the Journal of Medical Genetics.

Cystic fibrosis is caused by mutations in the cystic fibrosis transmembrane conductance regulator gene (CFTR). This gene contains the blueprint for a chloride channel on the surface of cells in the body. Normally, this channel mediates the accumulation of salt and fluids on the surface of the airways, thereby leading to a continuous cleaning of the airways. Defects in the CFTR channel prevent the transport of chloride ions and thus the humidification of the respiratory tract. As a result, the airways of affected individuals literally get plugged by a thickened, viscous mucus that leads to airway obstruction - patients are at risk of suffocating.

At the University of Münster, the lab of Prof. Thorsten Marquardt has now discovered a new disease that is caused by defects in another chloride channel, TMEM16A. This channel is also present on the surface of airway cells. In cooperation with the lab of Prof. Karl Kunzelmann of the University of Regensburg, the researchers evaluated the cellular effects of the disorder that is caused by a total loss of TMEM16A function. Surprisingly, they discovered that not only TMEM16A but also CFTR is not functional in these patients. Excitingly, this has the potential to improve the treatment of patients suffering from cystic fibrosis.

"We were astonished that children with TMEM16A deficiency don't have any respiratory symptoms at all. A loss of CFTR function due to lack of TMEM16A does not lead to clinincal symptoms of cystic fibrosis in these kids", states Dr. Julien Park, first author and researcher at the Marquardt lab at the Department of General Pediatrics at the University Hospital Münster. Similarly, the group of Prof. Karl Kunzelmann found in a mouse model that a double knock out of CFTR and TMEM16A does not develop lung disease.

Taken together, these results raise an intriguing question: Could the pharmacological inhibition of TMEM16A improve the respiratory symptoms of patients with cystic fibrosis? A significant reduction of mucus production and secretion as a consequence of TMEM16A inhibition has previously been shown under laboratory conditions. The researchers want to study this approach further in the future: "As a next step, we are planning clinical trials to evaluate a treatment of cystic fibrosis with TMEM16A inhibitors", states Karl Kunzelmann.

Credit: 
University of Münster

Study identifies potential approach to treat patients with severe COVID-19

Early data from a clinical study suggest that blocking the Bruton tyrosine kinase (BTK) protein provided clinical benefit to a small group of patients with severe COVID-19. Researchers observed that the off-label use of the cancer drug acalabrutinib, a BTK inhibitor that is approved to treat several blood cancers, was associated with reduced respiratory distress and a reduction in the overactive immune response in most of the treated patients.

The findings were published June 5, 2020, in Science Immunology. The study was led by researchers in the Center for Cancer Research at the National Cancer Institute (NCI), in collaboration with researchers from the National Institute of Allergy and Infectious Diseases (NIAID), both part of the National Institutes of Health (NIH), as well as the U.S. Department of Defense's Walter Reed National Military Medical Center, and four other hospitals nationally.

These findings should not be considered clinical advice but are being shared to assist the public health response to COVID-19. While BTK inhibitors are approved to treat certain cancers, they are not approved as a treatment for COVID-19. This strategy must be tested in a randomized, controlled clinical trial in order to understand the best and safest treatment options for patients with severe COVID-19.

The BTK protein plays an important role in the normal immune system, including in macrophages, a type of innate immune cell that can cause inflammation by producing proteins known as cytokines. Cytokines act as chemical messengers that help to stimulate and direct the immune response. In some patients with severe COVID-19, a large amount of cytokines are released in the body all at once, causing the immune system to damage the function of organs such as the lungs, in addition to attacking the infection. This dangerous hyperinflammatory state is known as a "cytokine storm." At present, there are no proven treatment strategies for this phase of the illness. The study was developed to test whether blocking the BTK protein with acalabrutinib would reduce inflammation and improve the clinical outcome for hospitalized patients with severe COVID-19.

This prospective off-label clinical study included 19 patients with a confirmed COVID-19 diagnosis that required hospitalization, as well as low blood-oxygen levels and evidence of inflammation. Of these patients, 11 had been receiving supplemental oxygen for a median of two days, and eight others had been on ventilators for a median of 1.5 (range 1-22) days.

Within one to three days after they began receiving acalabrutinib, the majority of patients in the supplemental oxygen group experienced a substantial drop in inflammation, and their breathing improved. Eight of these 11 patients were able to come off supplemental oxygen and were discharged from the hospital. Although the benefit of acalabrutinib was less dramatic in patients on ventilators, four of the eight patients were able to come off the ventilator, two of whom were eventually discharged. The authors note that the ventilator patient group was extremely clinically diverse and included patients who had been on a ventilator for prolonged periods of time and had major organ dysfunction. Two of the patients in this group died.

Blood samples from patients in the study showed that levels of interleukin-6 (IL-6), a major cytokine associated with hyperinflammation in severe COVID-19, decreased after treatment with acalabrutinib. Counts of lymphocytes, a type of white blood cell, also rapidly improved in most patients. A low lymphocyte count has been associated with worse outcome for patients with severe COVID-19. The researchers also tested blood cells from patients with severe COVID-19 who were not in the study. In comparison with samples from healthy volunteers, they found that these patients with severe COVID-19 had higher activity of the BTK protein and greater production of IL-6. These findings suggest that acalabrutinib may have been effective because its target, BTK, is hyperactive in severe COVID-19 immune cells.

Credit: 
NIH/National Cancer Institute

Editorial: COVID-19 pandemic likely to result in lasting changes to medical school curricula

Following disruptions to medical education that the COVID-19 pandemic brought to the United States this spring, "a return to a typical pre-COVID-19 teaching platform is unlikely," say Diane Wayne and colleagues in this Editorial. They suggest that "many creative changes are here to stay," including large-scale adoption of online education, noting that even faculty members who were previously resistant to remote learning now have evidence of technology's ability to meet the needs of pre-clinical students. Wayne and colleagues also note that adaptations made in recent months to ensure more focus on topics such as telemedicine, pandemic modeling, health equity, and population and public health could persist in medical school curricula going forward, helping to better prepare students for future pandemics and other unexpected medical scenarios. The knowledge that medical education may never be the same again will mean accrediting agencies will have to join in the adaptation effort, they write. "We now have an opportunity to create a better medical school experience with improved flexibility and outcomes that still ensures competence from this increasingly complex effort."

Credit: 
American Association for the Advancement of Science (AAAS)

Approved drug may help calm cytokine storm in COVID-19

The drug acalabrutinib, FDA-approved for the treatment of several types of B cell cancers, improved the oxygenation levels and decreased molecular markers of inflammation in a majority of 19 patients hospitalized for the treatment of severe COVID-19, according to a new study by Mark Roschewski and colleagues. The drug was administered to 11 patients on supplemental oxygen and 8 patients on mechanical ventilation over a 10-to-14-day course of treatment. At the end of treatment, 8 of 11 patients on supplemental oxygen were breathing room air, and 4 of 8 patients on ventilation were extubated, with 2 of the 8 breathing room air. Measurements of two proteins related to inflammation decreased in the majority of patients, with no signs of toxicity from the drug.

The study is not a clinical trial, but rather an off-label observational study to see if acalabrutinib could help dampen the massive immune response - sometimes called a "cytokine storm" - that is associated with the most severe cases of COVID-19. Acalabrutinib inhibits the Bruton tyrosine kinase (BTK) protein, which aids immune cells called macrophages in activating a variety of other proteins in the body's innate immune response. Patients with severe COVID-19 have a hyperinflammatory immune response that appears to be driven by macrophage activation, leading to acute respiratory distress syndrome (ARDS) and often death.

Roschewski et al. also studied BTK activation and immune markers in whole blood from 4 COVID-19 patients and 5 healthy individuals. BTK activation levels and the presence of the inflammatory protein IL-6 were higher in the COVID-19 patients, further suggesting that BTK may play a critical role in the disease's progression. An international prospective randomized controlled clinical trial is now underway to confirm the safety and efficacy of this BTK inhibitor as a therapeutic strategy against COVID-19, the authors note.

Credit: 
American Association for the Advancement of Science (AAAS)

Eclipse data illuminate mysteries of Sun's corona

image: White-light images of the solar corona during the 2019 total solar eclipses in Chile.

Image: 
Solar Wind Sherpas

Researchers at the University of Hawai'i Institute for Astronomy (IfA) have been hard at work studying the solar corona, the outermost atmosphere of the Sun that expands into interplanetary space. The properties of the solar corona are a consequence of the Sun's complex magnetic field, which is produced in the solar interior and extends outward into space.

IfA graduate student Benjamin Boe conducted a new study that used total solar eclipse observations to measure the shape of the coronal magnetic field with higher spatial resolution and over a larger area than ever before. The results were published in the Astrophysical Journal on June 3.

The corona is most easily seen during a total solar eclipse -- when the moon is directly between the Earth and Sun, blocking sunlight. Significant technological advances in recent decades have shifted a majority of analysis to space-based observations at wavelengths of light not accessible from the ground, or to large ground-based telescopes such as the Daniel K. Inouye Solar Telescope on Maui. Despite these advances, some aspects of the corona can only be studied during total solar eclipses.

Boe was advised by UH Mānoa Astronomy Professor Shadia Habbal, a coronal research expert. Habbal has led a group of eclipse chasers, the Solar Wind Sherpas, making scientific observations during solar eclipses for more than 20 years. These observations have led to breakthroughs in unveiling some of the secrets of the physical processes defining the corona.

"The corona has been observed with total solar eclipses for well over a century, but never before had eclipse images been used to quantify its magnetic field structure," explained Boe. "I knew it would be possible to extract a lot more information by applying modern image processing techniques to solar eclipse data."

Boe traced the pattern of the distribution of magnetic field lines in the corona, using an automatic tracing method applied to images of the corona taken during 14 eclipses over the past two decades. These data provided the chance to study changes in the corona over two 11-year magnetic cycles of the Sun.

Boe found that there were very fine-scale structures throughout the corona. Higher resolution images showed smaller-scale structures, implying that the corona is even more structured than what was previously reported. To quantify these changes, Boe measured the magnetic field angle relative to the Sun's surface.
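
The angle measurement described above can be sketched in simplified form: given two points sampled along a traced field line in Sun-centred, plane-of-sky coordinates, compute the angle between the local tangent and the local radial direction, where 0 degrees means purely radial. This is an illustration only, not the study's actual tracing code:

```python
import math

# Hypothetical sketch: angle (degrees, 0-90) between a traced field-line
# segment p1->p2 and the radial direction at the segment midpoint, in
# Sun-centred plane-of-sky coordinates (x, y).

def tangent_radial_angle_deg(p1, p2):
    """0 degrees = segment points radially outward; 90 = fully tangential."""
    mx, my = (p1[0] + p2[0]) / 2, (p1[1] + p2[1]) / 2  # midpoint (radial dir.)
    tx, ty = p2[0] - p1[0], p2[1] - p1[1]              # tangent vector
    # cos(angle) via normalized dot product of tangent and radial vectors;
    # abs() folds the result into the 0-90 degree range.
    dot = abs(mx * tx + my * ty)
    norm = math.hypot(mx, my) * math.hypot(tx, ty)
    return math.degrees(math.acos(min(dot / norm, 1.0)))

print(round(tangent_radial_angle_deg((1.0, 0.0), (2.0, 0.0))))  # 0 (radial)
print(round(tangent_radial_angle_deg((2.0, 0.0), (2.0, 1.0))))  # 76 (non-radial)
```

Applied along many traced lines and at many heights, such an angle statistic would distinguish a mostly radial field from the structured, non-radial field reported here out to 4 solar radii.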

During periods of minimum solar activity, the corona's field emanated almost straight out of the Sun near the equator and poles, while it came out at a variety of angles at mid-latitudes. During periods of maximum, the coronal magnetic field was far less organized and more radial.

"We knew there would be changes over the solar cycle but we never expected how extended and structured the coronal field would be," Boe explained. "Future models will have to explain these features in order to fully understand the coronal magnetic field."

These results challenge the current assumptions used in coronal modeling, which often assume that the coronal magnetic field is radial beyond 2.5 solar radii. Instead, this work found that the coronal field was often non-radial to at least 4 solar radii.

This work has further implications in other areas of solar research--including the formation of the solar wind, which impacts the Earth's magnetic field and can have effects on the ground, such as power outages.

"These results are of particular interest for solar wind formation. It indicates that the leading ideas for how to model the formation of the solar wind are not complete, and so our ability to predict and defend against space weather can be improved," Boe said.

Boe is already planning to be part of his team's next eclipse expeditions. The next one is slated for South America in December 2020.

Credit: 
University of Hawaii at Manoa

Tillage and cover cropping effects on grain production

image: Lead author of the study Dr. Gurbir Singh collecting hairy vetch cover crop biomass samples before termination of cover crops.

Image: 
Gurpreet Kaur

June 4, 2020 - Incorporating cover crops with tillage reportedly results in increased cover crop decomposition rates and increased mineralization of nutrients from cover crop biomass. Multiple studies have reported mixed results for corn-soybean grain yields when planted after cover crops.

In an article recently published in Agronomy Journal, researchers reported results of a four-year study on corn-soybean rotation. Treatments were completed under either conventional tillage or no-tillage, and with and without cover crop. Researchers included three rotations: cereal rye-soybean-hairy vetch-corn, cereal rye-soybean-oat+radish-corn, and no cover crop-soybean-no cover crop-corn.

The team found that rotation with hairy vetch as a preceding cover crop increased corn grain yield by 14.09% and 12.35% compared to rotations having no cover crop and oat+radish as preceding cover crops, respectively. Cereal rye cover crop biomass had 16-20 kg ha-1 greater N uptake compared to winter annual weeds in rotation without cover crops. Researchers reported that cereal rye preceding soybean reduced soybean yield by 0.3 to 0.6 Mg ha-1 compared to soybean following no cover crop.

Cover crops are touted for their soil and water quality related benefits. However, their adoption and success will depend on the selection of cover crop species available that do not reduce grain yields in following crops.

Credit: 
American Society of Agronomy

Can't concentrate at work? This AI system knows why

image: The project team (left to right): Yongli Ren, Mohammad Saiedur Rahaman, Shaw Kudo, Tim Rawling, Flora Salim and Jonathan Liono.

Image: 
Arup

Computer scientists have developed a way to measure staff comfort and concentration in flexible working spaces using artificial intelligence.

While hot desking and activity-based working allow cost savings and greater flexibility - and are said to increase staff collaboration and satisfaction - studies also show the noise and lack of privacy can be distracting.

With coronavirus restrictions beginning to ease in some parts of the world and employers planning the return to office-based work, a new sensor-based system developed by RMIT and Arup can offer insights on how to get the best out of these flexible working spaces.

The RMIT team behind the study are experts in using AI to uncover patterns in human behaviour.

For this project they worked with psychologists to identify several key variables for concentration and comfort levels in work environments, then set about measuring these with sensors.

They worked with global design and engineering firm Arup to develop and test their new AI-driven system on 31 staff in two of the company's activity-based working offices over four weeks.

Study lead author and Research Fellow in RMIT University's School of Science, Dr Mohammad Saiedur Rahaman, said data was collected on noise levels, indoor temperature and air quality, humidity, air pressure, and even electromagnetic fields.

"We used that information along with survey data to train machine learning algorithms that could identify patterns in perceived concentration and activity, and then provided solutions for making these spaces work best for people," Rahaman said.
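The basic idea of pairing sensor readings with survey responses can be illustrated with a small sketch. The feature names, the simulated labels, and the nearest-centroid classifier below are all illustrative assumptions, not details from the RMIT study:

```python
import random
import statistics

random.seed(0)

# Hypothetical per-survey sensor features: (noise_db, temperature_c, co2_ppm).
# These names are illustrative; the study's actual sensor set was richer.
data = [(random.gauss(55, 8), random.gauss(22.5, 1.5), random.gauss(600, 150))
        for _ in range(400)]

# Simulated survey labels echoing the reported finding that high CO2
# impairs focus: 1 = "could concentrate", 0 = "distracted".
labels = [1 if co2 < 700 else 0 for _, _, co2 in data]

# A minimal machine-learning baseline: nearest-centroid classification.
def centroid(cls):
    pts = [x for x, y in zip(data, labels) if y == cls]
    return tuple(statistics.fmean(dim) for dim in zip(*pts))

c0, c1 = centroid(0), centroid(1)

def predict(x):
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    return 1 if d1 < d0 else 0

accuracy = sum(predict(x) == y for x, y in zip(data, labels)) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

In practice a model like this would be trained on labelled data from the surveys and then used to flag conditions, such as rising CO2, under which a given zone is likely to hurt concentration.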

What they found

Staff were generally supportive of their activity-based working setup.

However, data showed different people concentrated better in different zones, as well as other important insights for managing staff in the space.

For example, many people had a favourite spot - such as near the window, kitchen or their manager - and found concentrating more difficult if they weren't able to sit there. They were also more sensitive to the office temperature not being exactly right if they missed out on their favourite seat.

Regardless of where they sat, office temperature was a major factor in how comfortable and focused people were.

Most found temperatures below 22.5C too cold to fully concentrate and, as the day progressed, people became increasingly sensitive to this.

A major influence on perceived concentration in the mornings, unsurprisingly, was sleep quality the night before.

The number of formal and informal meetings was also shown to have a large impact on perceived concentration, with those who had five formal meetings in a day reporting lower concentration levels compared with those who had fewer.

'Informal meetings' - run-ins encouraged by activity based working - were also measured. While they were preferred by some workers and could be used to reduce the number of formal meetings, they were seen as another source of distraction for others.

Rahaman said high CO2 levels, due to high occupant densities, were also a barrier in people's ability to concentrate.

"The results for CO2 and thermal comfort underline just how important a high-quality heating, cooling and ventilation system is in office design, as well as indoor plants to reduce CO2," Rahaman said.

Making work spaces work better

"We see this type of system having the potential to eventually be used to enable informed decision-making regarding workplace design and layout, or even to suggest to people when to take breaks, what zone might suit them best and so on," Rahaman said.

Arup engineer and project partner, Shaw Kudo, said beyond the useful insights on their own office, they also saw it as an opportunity to help the wider property industry.

"Modern offices, new and existing, are likely to undergo change and potentially redesign workplaces post COVID-19," he said.

"The valuable findings from this work can feed into future designs and allow Arup to better service our clients as they plan their future workplace - whether this is a new-build, or a return to the office after COVID-19."

Fellow Arup engineer Tim Rawling said they were also looking to adapt the work to assess the impact of working from home on people's work experience, given the variability of home working spaces.

"Given the changing landscape of work environments, we're excited by the opportunity to explore the application of this research to new working environments and flexible working arrangements," he said.

Study leader from RMIT's School of Science, Associate Professor Flora Salim, said recent technological developments and the proliferation of pervasive technologies had opened up many opportunities to collect data from various sensors and smart devices.

"Despite the myriad applications harnessing this data for smart decision-making systems, this is the first research we're aware of that has used pervasive sensing passively to measure workers' perceived concentration levels while they are at work," Salim said.

"We hope it can make a real contribution to work practices that reflect what people need to perform their best."

Credit: 
RMIT University

Strategic redundancy can prevent collapse of supply chains during global crises

image: Strategic redundancy can prevent the collapse of supply chains during global crises.

Image: 
University of Texas at Austin

AUSTIN, Texas -- When the novel coronavirus began spreading during the early months of 2020, it put kinks in multinational production chains -- first in China and then around the globe. But it didn't have to happen that way, according to Francisco Polidoro, associate management professor at the McCombs School of Business at The University of Texas at Austin.

In a forthcoming paper published online in advance by the Academy of Management Review, he suggests companies use redundancy as a way to fortify their operations against unforeseeable events such as pandemics.

It's a matter of preparing for the unexpected.

Unlike risk, which covers events that have happened before and could strike again, uncertain events lack any data points to inform decisions. Uncertainty refers to what you do not even know that you don't know.

Polidoro, with co-authors Curba Lampert of Florida International University in Miami and Minyoung Kim of the University of Kansas, introduces a strategy called branching. A company, say the researchers, can build multiple branches into its value chain, the string of steps that lead from research and development through manufacturing and sales. When a crisis strikes one branch, the overall chain can keep running.

"With branching, the idea is to have a kind of redundancy by design. It makes you more resilient. If you wait until your value chain is disrupted, it may be too late," Polidoro said. "You invest in flexibility before you need it."

Apple Inc., for example, which assembles iPhones in China, struggled with both manufacturing and distribution when the pandemic shuttered its factories. The pandemic created ripple effects that negatively affected the design of new products, as communication between design and manufacturing teams was also disrupted.

Other companies are suffering similar versions of Apple's woes. Many pharmaceutical companies rely on China as the sole source of key ingredients for their drugs. Also, more than 60 medical manufacturers have facilities in China dedicated to essential medical devices. In fact, at least 200 of the Fortune Global 500 had a presence in Wuhan, China.

What appears to be sound strategy can risk a total collapse when uncertain events disrupt locations that companies heavily rely on to obtain efficiency gains. The current concerns with drug shortage risks due to the pandemic remind us that when uncertain events break down global supply chains, the entire economy can suffer.

The researchers said that if branching had been put into practice, these companies would have had facilities in other countries, limiting their financial losses. They also noted that branching may reduce the value companies capture now, but it may sustain that value for longer. There's a trade-off between efficiency and flexibility. It's the difference between a somewhat higher cost for making a phone versus being able to make it at all.

"When you design your operations to optimize economies of scale, you've accounted for standard issues that could go wrong," Polidoro said. "Then a nonstandard event occurs that you have not accounted for, like a pandemic, a trade war or closing national borders. All of a sudden, the decisions that optimized your operations may lead to unprecedented disruptions."

Credit: 
University of Texas at Austin

Largest ever study of radiosurgery for brain metastases from small cell lung cancer

image: Chad Rusthoven, MD, and colleagues report in JAMA Oncology no survival benefit for whole-brain radiation compared with radiosurgery in small cell lung cancer

Image: 
University of Colorado Cancer Center

The international First-line Radiosurgery for Small-Cell Lung Cancer (FIRE-SCLC) analysis led by University of Colorado Cancer Center researchers and published today in JAMA Oncology details clinical outcomes for 710 patients with brain metastases from small cell lung cancer treated with first-line stereotactic radiosurgery (SRS), without prior treatment with whole-brain radiation (WBRT) or prophylactic cranial irradiation (PCI).

The study represented a substantial research effort, drawing on international collaborators at 28 individual centers and one prospective clinical trial across Asia, North America, and Europe. Following first-line SRS, the outcomes were encouraging overall, with a median time to brain progression of 8.1 months and a median overall survival of 8.5 months.

The investigators also compared these SRS results with a control group of 219 patients treated with first-line WBRT for brain metastases, which is the current standard of care for small cell lung cancer. Importantly, no overall survival benefit was observed with WBRT compared to SRS. In fact, the survival outcomes were slightly better with SRS even after matching for baseline characteristics. The authors were careful to note, however, that the observed differences in survival in favor of SRS could be related to uncontrolled treatment selection factors in the setting of a retrospective analysis.

"As expected, whole brain radiation was superior to focused treatment with radiosurgery in lengthening the time to disease progression in the brain. However, the improvement in brain control with whole brain radiation did not appear to translate into an improvement in overall survival," says Chad Rusthoven, MD, assistant professor in Radiation Oncology at the University of Colorado Cancer Center, the paper's lead author.

The study, with senior author Tyler Robin, MD, is the largest analysis of outcomes with first-line SRS for brain metastases from small cell lung cancer, offering important descriptive and comparative data on this potential treatment paradigm.

"Although SRS has become the preferred treatment strategy for limited numbers of brain metastases arising from many cancer types, due to improved quality of life and cognitive preservation compared to WBRT, small cell lung cancer remains an important exception where WBRT has remained the standard of care for limited and even solitary brain metastases. The primary reason for this is that small cell lung cancer patients were excluded from the randomized trials that established SRS," Robin says.

Because small cell lung cancer patients were excluded from the landmark prospective trials evaluating SRS, understanding of SRS for small cell lung cancer has lagged behind other cancers, including non-small cell lung cancer.

"Small cell lung cancer is known to have an increased propensity for spread to the brain compared to many other cancers. Historical caution regarding first-line SRS for small cell lung cancer has generally been related to concerns that omission of WBRT could result in rapid disease progression and decreased survival times. Thus, it is an important observation that, in this large international study, the omission of WBRT in favor of first-line SRS did not result in diminished overall survival," Rusthoven says.

This analysis, which may represent the strongest data reported thus far in support of first-line SRS for small cell lung cancer, comes at a dynamic time in the evolution of small cell lung cancer management.

"Paradigms for the treatment of small cell lung cancer are evolving. In recent years, we have seen the integration of immunotherapy into small cell lung cancer management, a decrease in the administration of WBRT, and national guideline updates recommending routine brain MRI surveillance for all patients. These changes may be expected to increase the identification of small cell lung cancer patients with limited brain metastases who may be candidates for first-line SRS," Robin says.

The study also provided detailed analyses of outcomes with SRS by the number of brain lesions treated. Patients treated with SRS for a single brain metastasis experienced the best brain control and overall survival outcomes. Beyond that, clinical outcomes for patients with 2-4 vs 5-10 brain metastases were very similar, whereas patients with 11 or more metastases had the shortest time to brain progression and overall survival.

The authors note that prospective trials evaluating the role of first-line SRS for small cell lung cancer patients are needed to confirm the encouraging results observed in this retrospective study. In the meantime, this large international analysis provides important descriptive and comparative data on first-line SRS as a potential emerging treatment option for brain metastases in carefully selected small cell lung cancer patients.

Credit: 
University of Colorado Anschutz Medical Campus

Widely available indigestion drug may curb COVID-19 symptoms in mild to moderate disease

A widely available and inexpensive drug that is used to ease the symptoms of indigestion may prove a worthy contender for treating COVID-19 infection in those whose disease doesn't require admission to hospital, suggest the findings of a small case series, published online in the journal Gut.

The effects were felt within 24 to 48 hours of taking famotidine, and a rigorous clinical trial is now warranted to see if the drug could be an effective treatment for COVID-19, say the researchers.

Famotidine (Pepcid AC) belongs to a class of drugs known as histamine-2 receptor antagonists, which reduce the amount of stomach acid produced. Famotidine can be taken in doses of 20-160 mg, up to four times a day, for the treatment of acid reflux and heartburn.

The researchers report on 10 people (6 men; 4 women) who developed COVID-19 infection, all of whom happened to have been taking famotidine during their illness.

The severity of five cardinal symptoms--cough, shortness of breath, fatigue, headache, and loss of taste/smell--as well as general unwellness, was measured using a version of a 4-point scale normally applied to assess the severity of cancer symptoms (ECOG PS).

Seven of the patients tested positive for COVID-19, using a swab test; two had antibodies to the infection; and one patient wasn't tested but was diagnosed with the infection by a doctor.

Their ages ranged from 23 to 71 and they had a diverse range of ethnic backgrounds and known risk factors for COVID-19 severity, including high blood pressure and obesity.

All started taking famotidine when they were feeling very poorly with COVID-19, the symptoms of which had been going on for 2 to 26 days at that point.

The most frequently used dose was 80 mg taken three times a day, with the average treatment period lasting 11 days, but ranging from 5 to 21 days.

All 10 patients said that symptoms quickly improved within 24-48 hours of starting famotidine and had mostly cleared up after 14 days.

Improvement was evident across all symptom categories assessed, but respiratory symptoms, such as cough and shortness of breath, improved more rapidly than systemic symptoms, such as fatigue.

Seven of the patients didn't experience any side effects while on famotidine, and in the three who did, these were mild, and all but temporary forgetfulness were known side effects associated with taking the drug.

While promising, the researchers point out that the findings might have been affected by 'the placebo effect,' and/or hazy recall, added to which the number of case study participants was small.

"Our case series suggests, but does not establish, a benefit from famotidine treatment in outpatients with COVID-19," they caution. And it's not clear how famotidine might work: if it might incapacitate the virus in some way or alter a person's immune response to it.

"Clinically, we unreservedly share the opinion that well designed and informative studies of efficacy are required to evaluate candidate medications for COVID-19 as for other diseases," they emphasise.

Nevertheless, they suggest their findings warrant further more detailed study, adding that a clinical trial, testing the combination of famotidine with the antimalarial drug hydroxychloroquine in patients admitted to hospital with COVID-19, is already under way.

"An outpatient study of oral famotidine that investigates efficacy for symptom control, viral burden and disease outcome and assesses the effects of medication use on long term immunity should be considered to establish if famotidine may be of use in controlling COVID-19 in individual patients while also reducing the risk of SARS-CoV-2 transmission," they conclude.

Credit: 
BMJ Group

Applying symptom tracking to COVID-19 outpatient care using famotidine

A patient-reported symptom tracking method used for patients with cancer has now been adapted for patients with COVID-19. Investigating the effect of famotidine, a potential treatment for COVID-19, on non-hospitalized patients with COVID-19, clinicians at Northwell Health and cancer researchers at Cold Spring Harbor Laboratory (CSHL) developed the method to use in addition to laboratory tests. This outpatient approach addresses the need to care for the majority of COVID-19 patients who do not require hospitalization. The first clinical case series showed that famotidine may help patients with mild to moderately severe symptoms from COVID-19. Next, the team will test the drug in a randomized clinical trial.

Published in the journal Gut, the Northwell-CSHL case series is unique in its adaptation of quantitative tracking of patient-reported outcome measures. The methodology is suitable for testing drugs in patients well enough to be managed at home and allows the recruitment of diverse subjects via community-based health organizations and individual practitioners.

The lead author of the study, CSHL Assistant Professor Tobias Janowitz, is a Medical Oncologist and a Cancer Researcher, who investigates the whole-body causes and effects of diseases. "The experience of a patient at one point in time is very valuable, but learning about the change in their experience over time is even more important," says Janowitz. "Change indicates if the patients' condition is getting better or worse. A graded symptom score enables the physician and the patient to track symptoms using numbers."

Observing that for COVID-19, most symptomatic people do not require hospitalization, Janowitz and colleagues developed a 4-point scale for six common COVID-19 associated symptoms that patients score every day. Janowitz simplifies how the scale can help track the course of a patient's disease:

"You may call up your doctor and say, I have headaches and shortness of breath, and am only able to do the basics for self-care, which would be grade 3 symptoms. If you still had the symptoms two days later, but are now able to do light work, these symptoms would now be scored at grade 2. This approach makes it very easy for you and your doctor to document that your symptoms are improving. The value of this approach from a research perspective is that experiences from many patients become comparable and can be pooled for analysis."

If a drug speeds recovery, then most patients will report more rapid improvement of symptoms.
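The graded scale described in the quote lends itself to simple tracking code. Below is a minimal sketch of such a symptom diary; the symptom names and the 0-3 grading convention are illustrative assumptions, not the study's actual instrument:

```python
from dataclasses import dataclass, field

# Six common COVID-19-associated symptoms, each graded daily on a 0-3 scale
# (0 = absent, 3 = severe) -- an illustrative stand-in for the 4-point scale.
SYMPTOMS = ("cough", "shortness_of_breath", "fatigue",
            "headache", "loss_of_taste_smell", "general_unwellness")

@dataclass
class SymptomDiary:
    days: list = field(default_factory=list)  # one dict of grades per day

    def record(self, **grades):
        """Record one day's grades; unmentioned symptoms default to 0."""
        entry = {s: 0 for s in SYMPTOMS}
        for name, grade in grades.items():
            if name not in SYMPTOMS or grade not in range(4):
                raise ValueError(f"bad entry: {name}={grade}")
            entry[name] = grade
        self.days.append(entry)

    def total(self, day):
        """Composite severity for one day (0-18)."""
        return sum(self.days[day].values())

    def improving(self):
        """True if the latest composite score is below the first."""
        return len(self.days) > 1 and self.total(-1) < self.total(0)

diary = SymptomDiary()
diary.record(headache=3, shortness_of_breath=3)  # grade-3 day
diary.record(headache=2, shortness_of_breath=2)  # later: grade-2 day
print(diary.total(0), diary.total(-1), diary.improving())  # 6 4 True
```

Because every patient reports on the same numeric scale, scores like these can be pooled across patients and compared over time, which is exactly what makes the approach useful for trial analysis.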

The innovations in this study are the product of scientists and physicians who never expected to work on a pandemic infection. But the collaborative, multidisciplinary approach is a hallmark of the strategic research affiliation between CSHL and the academic physicians at Northwell Health. Joseph Conigliaro, Chief of General Internal Medicine at Northwell Health, Professor at the Feinstein Institutes, and a co-author on the study, says:

"There are a lot of things that we were doing that worked well in how we address this pandemic. I'm a health services addiction researcher. I didn't think I'd ever be the person who would be studying this pandemic and this virus. And Tobias does cancer research. What we did is we used our existing tools and talents that were not specifically geared towards studying a viral pandemic, but we used it and modified it in ways that were very, very helpful."

CSHL Professor David Tuveson, Director of the CSHL Cancer Center and another co-author, is interested in the broader implications of the patient-oriented research approach as a way to better connect science and healthcare. CSHL's commitment to bridging the gap between research and the clinic is evident in an initiative championed by Janowitz to study how disease impacts the entire body. Tuveson says:

"You're trying to scientifically assess a symptom, is what you're really doing. Tobias and his colleagues can look at the whole body while they study cancer. And so Tobias is someone who thinks about the big picture of health. He's basically shifting his approach to solving complex problems to COVID-19 and he started by saying 'how can we describe one patient relative to the next?'"

Janowitz agrees, "It seems reasonable to me to make that extension to this healthcare crisis and to acknowledge that we can learn a lot from the individual who has the disease as long as we find a way to make it quantifiable."

Once validated, the patient-reported symptom tracking method will be a key component in a clinical trial that is "double-blind," meaning that neither patient nor doctor know whether the patient is getting the test drug or a placebo until the trial is completed. Without a double-blind clinical trial and a careful scientific examination of the mechanism of action of a drug and its side effects, it is impossible to rule out that the preliminary findings were due to factors other than the drug. The early findings of this case series were communicated to co-author Dr. Timothy Wang at Columbia University Medical Center. Interestingly, Dr. Wang and colleagues were also able to find an association between famotidine usage and the improved outcomes of patients hospitalized for COVID-19. Based on the findings in the case series, a double-blind clinical trial of famotidine is the next step in the joint plan of the Northwell-CSHL team. Other treatments may also be explored.

Conigliaro explains why he is hopeful that the study will work as designed to find drugs that will be effective against COVID-19:

"We had about a thousand patients that we tracked in our practices, just in my couple of academic practices, that never made it to the hospital. And my docs were calling them every day and asking all those questions: 'Do you have aches and pains? Do you have fever today? How's your breathing today?' We didn't know what their blood levels were. We didn't know what their oxygen levels were, because we didn't have access to that. So these surrogate measures are very common and the best way to assess what's going on in the outpatient. I'm actually pretty confident. I think there'll be another surge but I think we'll be much better prepared and I think that hopefully lives will be saved."

Credit: 
Cold Spring Harbor Laboratory

Use loss of taste and smell as key screening tool for COVID-19, researchers urge

King's College London researchers have called for the immediate use of additional COVID-19 symptoms to detect new cases, reduce infections and save lives.

In a letter published in The Lancet, the team discussed how loss of taste and smell - anosmia - should form part of screening measures for the virus.

They said: "As countries slowly emerge from lockdown measures, it is imperative to correctly contact-trace infected individuals. We believe that having added loss of smell and taste to the list of COVID-19 symptoms is of great value as it will help tracing almost 16% of cases that otherwise would have been missed."

The team, led by Professor Tim Spector, previously reported that loss of smell and taste is a key predictor of COVID-19 in addition to the most established symptoms of a high temperature and a new, continuous cough. The relative importance of the extra symptom was, however, disputed by sections of the UK government when it announced it was including it. This additional analysis of data from the COVID Symptom Study app and its 3.7 million users sought to quantify the clinical value of recording loss of smell in the population.

From 76,260 people with symptoms who tested positive for COVID-19 up to 19 May, 28.5% never reported any fever or cough, and 16% reported loss of smell but not fever or cough. The prevalence of loss of smell and taste was three-fold higher in individuals testing positive (65%) than in those testing negative (22%), making it the strongest single predictor of infection and suggesting that people with loss of smell and taste should self-isolate for at least seven days or until they can be tested.
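The screening value quoted here is straightforward to verify from the published figures. A quick back-of-the-envelope check, using only the numbers stated in the letter:

```python
positives = 76_260            # app users with symptoms who tested positive
no_fever_or_cough = 0.285     # share who never reported fever or cough
smell_loss_only = 0.16        # share with loss of smell but no fever or cough

# Cases that fever/cough-based screening alone would miss entirely:
missed = round(positives * no_fever_or_cough)

# Of those, cases recoverable by also screening for loss of smell:
recovered = round(positives * smell_loss_only)

print(f"missed by fever/cough alone: {missed}")    # 21734
print(f"caught by adding anosmia:    {recovered}") # 12202
```

In other words, of the roughly 21,700 positive cases presenting without the two classic symptoms, adding anosmia to the screening list would flag over 12,000.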

Professor Tim Spector from King's College London said: "We believe that loss of smell and taste is a very common COVID-19 symptom and in fact, occurs more often than fever and lasts longer (5 days on average compared to only 2 for fever). Infections could be reduced, and lives saved now that this non-flu-like symptom is widely recognised, and actions are taken."

The researchers suggest that policymakers should consider these findings and their implications for mass screening as part of other public health measures in key areas such as schools, hospitals, airports and care homes.

Professor Spector said: "Our data suggests that low-cost so-called 'smell the difference' screening tests, which are already being used in some workplaces to screen people as they enter buildings, would capture a larger number of positive cases than temperature sensors do. We therefore feel that it should form part of a wider public health approach to reducing the infection rate."

Credit: 
King's College London