Tiny device enables new record in super-fast quantum light detection

image: The integrated detector combines a silicon photonic chip with a silicon micro-electronics chip, achieving record speed in detecting quantum light.

Image: 
University of Bristol

Bristol researchers have developed a tiny device that paves the way for quantum computers and quantum communications that are significantly faster than the current state of the art.

Researchers from the University of Bristol's Quantum Engineering Technology Labs (QET Labs) and Université Côte d'Azur have made a new miniaturized light detector to measure quantum features of light in more detail than ever before. The device, made from two silicon chips working together, was used to measure the unique properties of "squeezed" quantum light at record high speeds.

Harnessing unique properties of quantum physics promises novel routes to outperform the current state-of-the-art in computing, communication and measurement. Silicon photonics - where light is used as the carrier of information in silicon micro-chips - is an exciting avenue towards these next-generation technologies.

"Squeezed light is a quantum effect that is very useful. It can be used in quantum communications and quantum computers and has already been used by the LIGO and Virgo gravitational wave observatories to improve their sensitivity, helping to detect exotic astronomical events such as black hole mergers. So, improving the ways we can measure it can have a big impact," said Joel Tasker, co-lead author.

Measuring squeezed light requires detectors that are engineered for ultra-low electronic noise in order to pick out the weak quantum features of light. But such detectors have so far been limited in the speed of the signals they can measure - about one billion cycles per second (1 GHz).

"This has a direct impact on the processing speed of emerging information technologies such as optical computers and communications with very low levels of light. The higher the bandwidth of your detector, the faster you can perform calculations and transmit information," said co-lead author Jonathan Frazer.

The integrated detector has so far been clocked at an order of magnitude faster than the previous state of the art, and the team is working on refining the technology to go even faster.

The detector's footprint is less than a square millimeter - this small size enables the detector's high-speed performance. The detector is built out of silicon microelectronics and a silicon photonics chip.

Around the world, researchers have been exploring how to integrate quantum photonics onto a chip to demonstrate scalable manufacture.

"Much of the focus has been on the quantum part, but now we've begun integrating the interface between quantum photonics and electrical readout. This is needed for the whole quantum architecture to work efficiently. For homodyne detection, the chip-scale approach results in a device with a tiny footprint for mass-manufacture, and importantly it provides a boost in performance," said Professor Jonathan Matthews, who directed the project.

Credit: 
University of Bristol

HSS presents innovative research at 2020 ACR Annual Meeting

At this year's American College of Rheumatology virtual meeting, Hospital for Special Surgery (HSS) presented exciting research related to rheumatology and orthopedic surgery.

The research focuses on the diagnosis of renal disorders, the risk of venous thromboembolism after total knee replacement (TKR), and the care of pediatric and young adult patients with rheumatologic diseases. There are also studies related to the care of rheumatology patients during the COVID-19 pandemic.

"While there have been significant advances in science over the years, there are areas of unmet needs warranting continued investigation," said S. Louis Bridges, Jr., MD, PhD, Physician-in-Chief, Chair of the Department of Medicine and Chief of the Division of Rheumatology at HSS. "HSS has a relentless focus on improving patient care through innovation and discovery. By spearheading multi-center studies, clinical trials, and laboratory-based research, we continue to improve the quality of life for patients affected by rheumatic and musculoskeletal conditions."

What follows are some highlights from the meeting:

Geographical Variations in COVID-19 Perceptions and Patient Management: A National Survey of Rheumatologists

A study led by HSS rheumatologist Bella Mehta, MBBS, MS, looked at the perceptions and behaviors of rheumatologists in the United States regarding the risk of COVID-19 for patients with autoimmune disorders and how those patients were managed with immunosuppressive and anti-inflammatory medications. The 271 respondents were asked whether patients with rheumatic diseases were at higher risk of COVID-19 and whether the pandemic had led them to reduce the use, dosage, or frequency of biologics or steroids. Overall, the survey found geographical variations regarding perceptions of patients' risk of COVID-19, with responses that were significantly different in the Northeast region of the United States compared with other regions.

Can a Clinical Disease Activity Index Based on Patient-Reported Joint Counts (PT-CDAI) Be Used to Inform Target-Based Care in Telemedicine? An Analysis of 2 Early RA Cohort Studies

A team that included HSS rheumatologist Vivian P. Bykerk, BSc, MD, FRCPC, looked at data from two cohort studies on patients with early rheumatoid arthritis (RA) that compared patient and physician assessments of tender and swollen joints. This research is important because COVID-19 has forced rheumatologists to shift from in-person clinical visits to telemedicine, which limits their ability to carry out complete joint exams. The study found good agreement between patient and physician assessments when it came to distinguishing active from controlled disease. In addition, the researchers reported that predictors of larger discrepancies between patient and physician assessments may help identify patient subsets that would benefit most from physician-guided training in joint self-assessment, as well as from more probing questioning during telehealth visits to confirm active synovitis.

Characterization of Antiphospholipid Antibody-Associated Nephropathy: An International Survey of Renal Pathology Society Members

A survey led by HSS rheumatologist Medha Barbhaiya, MD, MPH, sought to determine whether pathologists worldwide use uniform criteria to distinguish antiphospholipid antibody-associated nephropathy (aPL-N) from other forms of thrombotic microangiopathy. The study was done in parallel with an international effort to develop new classification criteria for antiphospholipid syndrome. The web-based survey included 780 members of an international Renal Pathology Society; 111 renal pathologists responded, representing 33 countries. The participants were asked to judge whether two acute and eight chronic aPL-N features were consistent with acute or chronic aPL-N; they were not provided with serology information. Results from the survey indicated that more than 90% of pathologists worldwide agreed that the renal pathologic features most specific for aPL nephropathy are noninflammatory glomerular or small arterial microthrombi and organized microvascular thrombi with recanalization. In the absence of serologic data, more than 75% of pathologists indicated that chronic glomerular or small arterial changes lack specificity. These findings indicate the importance of serologic test results in biopsy interpretation and suggest higher specificity for certain acute or chronic features.

The Risk of Venous Thromboembolism After Septic Total Knee Replacement (TKR) Revision Is Double the Risk After Aseptic TKR Revision: An Analysis of Administrative Discharge Data

A team led by HSS rheumatologist Anne R. Bass, MD, assessed the 90-day risk of postoperative venous thromboembolism (VTE) in people who underwent revision TKR surgery. Although it was already known that infection can increase the risk of thrombosis by triggering coagulation pathways, the impact of infection in this context was not well understood. The study, which included 25,441 of these surgeries performed between 1998 and 2014, used data from the New York Statewide Planning and Research Cooperative System. Overall, 69% of these revisions were for mechanical causes (aseptic), 28% were for infection (septic), and 3% were related to a periprosthetic fracture. The researchers found that the risk of VTE after septic revision surgery was double the risk after aseptic revision surgery, and was 2.6-fold higher after revision surgeries for fracture. These findings point to the importance of taking infection and fracture status into account when planning VTE prophylaxis for patients having revision surgery.

Using a Patient-Engaged Approach to Identify Cross-Cutting Disease Factors Impacting Mental Health in Youth with Rheumatologic Disease

A team including HSS chief of pediatric rheumatology Karen Brandt Onel, MD, examined the mental health experiences of those with juvenile arthritis, juvenile dermatomyositis, or systemic lupus erythematosus. It included patients aged 14 to 24 and parents of patients between the ages of 8 and 24. The primary outcome was the presence of any clinical or self-diagnosed mental health problem. The survey included information on cross-cutting disease factors such as disease duration, active disease status, current steroid medication, history of disease flare following remission, and appearance-altering comorbidities (such as psoriasis, stretch marks, alopecia, skin ulceration, and visible scarring). The researchers also examined results by mental health problem (depression, anxiety, and self-harm/suicidal ideation). Overall, they found that certain rheumatologic disease factors, including those that alter appearance, are predictive of mental health problems such as depression or anxiety. The findings will help identify targets for mental health screening in youth with rheumatologic diseases.

New Juvenile Idiopathic Arthritis Quality Measure Set for the Pediatric Rheumatology Care and Outcomes Improvement Network

A team that included HSS pediatric rheumatologist Nancy Pan, MD, reported highlights of new quality measures for the treatment of juvenile idiopathic arthritis. The study also provided information about the performance of these new measures, which were created by the Pediatric Rheumatology Care and Outcomes Improvement Network. The measures take into consideration clinical health importance, scientific validity, and feasibility, as well as best practices. They include both clinical evaluations based in part on the clinical Juvenile Arthritis Disease Activity Score and patient-reported outcomes such as pain, physical function, and overall well-being. An analysis of the new measures found that they are feasible for use on patients with juvenile idiopathic arthritis and can help to maximize quality improvement efforts and optimize the delivery of care.

Credit: 
Hospital for Special Surgery

Water predictions: Telling when a nanolithography mold will break through droplets

image: Measuring how a drop of water makes contact with the grooves of a nanolithography mold to find out how worn out the mold is.

Image: 
Jun Taniguchi, Tokyo University of Science

Ultraviolet nanoimprint lithography (UV-NIL) is a manufacturing technique for producing nanostructures using UV-curable resin. One of its main advantages is its sheer simplicity; UV-NIL essentially consists of pouring a liquid resin over a nanostructured mold, solidifying the resin with UV irradiation, and then releasing it from the mold. The result is a solid polymer with a nanostructure that is the inverse of the mold's. Using this technique, a great variety of functional devices and thin films can be made for applications in fields such as optics, electronics, healthcare, biology research, and solar cells, to name just a few.

It is safe to say that the most valuable piece of equipment in the whole UV-NIL process is the master mold. To make it last longer, replica molds made from the master mold are used for the mass production of nanostructured devices. However, these replica molds are not as durable as the master mold, and they generally begin to show macroscopic signs of wear after a couple of thousand imprints. Unfortunately, no reliable, standard method to predict the lifetime of replica molds exists--yet.

In a recent study published in Nanomaterials, researchers from Tokyo University of Science and Autex Inc., Japan, came up with a clever strategy to tell when a replica mold will break. They managed to do this by looking at water droplets placed on top of a mold with a pattern of spaced parallel lines and noting how the adherence of the droplets changes as the mold wears out and distorts with use.

But how does such a method work? Liquid droplets resting on a surface have a measurable property called contact angle, which is the angle at which the liquid meets the solid at the interface. This angle is easily measurable using commercial camera-based systems.

Now, in a nanostructured mold with parallel lines (grooves), the contact angle of water exhibits anisotropy, meaning that it varies along different directions. More specifically, a water droplet placed on a pristine mold with parallel grooves will spread along the direction of the grooves more than in the direction perpendicular to the grooves. However, the researchers noticed that, as the mold wears out with repeated use, the contact angle changes differently along each direction. Professor Jun Taniguchi from Tokyo University of Science explains: "As the number of uses of the mold increases, the contact angle along the direction parallel to the grooves decreases linearly. In contrast, the contact angle along the perpendicular direction decreases much more quickly at first and then stays at a constant minimum value. We found that molds tended to become defective almost exactly when the number of repetitions made the contact angles in both directions equal."

The researchers verified their method using molds with different groove widths, and the results held up (see the accompanying Figure). This means that one can make accurate predictions about the remaining lifetime of a mold with parallel grooves through a simple process. First, check that the contact angle in the perpendicular direction has reached a stable minimum value. Then, compute the decreasing line function with the stored values for the contact angle along the parallel direction. Finally, calculate the number of uses after which this line will intersect the minimum value; the mold will most likely break around that number.
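The prediction procedure described above amounts to a simple linear extrapolation, and can be sketched in a few lines of Python. The measurement values below are purely illustrative, not taken from the study:

```python
import numpy as np

def predict_mold_lifetime(uses, parallel_angles, perpendicular_min):
    """Estimate after how many imprints a grooved replica mold will fail.

    uses              -- imprint counts at which the angles were measured
    parallel_angles   -- contact angles (deg) along the grooves, which
                         decrease roughly linearly with use
    perpendicular_min -- the stable minimum contact angle (deg) reached
                         across the grooves
    The mold is expected to fail around the point where the fitted
    parallel-direction line meets the perpendicular minimum.
    """
    # Fit a straight line to the parallel-direction contact angles.
    slope, intercept = np.polyfit(uses, parallel_angles, 1)
    if slope >= 0:
        raise ValueError("parallel contact angle should decrease with use")
    # Solve intercept + slope * n = perpendicular_min for n.
    return (perpendicular_min - intercept) / slope

# Illustrative (made-up) measurements:
uses = [0, 500, 1000, 1500]
parallel = [95.0, 90.0, 85.0, 80.0]  # decreases ~linearly with use
lifetime = predict_mold_lifetime(uses, parallel, perpendicular_min=75.0)
print(round(lifetime))  # -> 2000
```

Here the parallel-direction angle loses 5 degrees per 500 imprints, so it reaches the 75-degree perpendicular minimum at around 2,000 imprints, which would be the predicted failure point.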

This simple approach will be useful to determine the expected lifetime of molds for UV-NIL. "Though our method is only applicable to molds with parallel grooves, it can be used to assess the durability of mold materials themselves; the predicted lifetime of a mold material applies to virtually any shape," remarks Mr Shin Hiwasa from Autex Inc. Overall, this study fills an important knowledge gap in UV-NIL, reducing its associated costs by providing an easy way to predict when molds will wear out.

The proposed strategy will hopefully make this nanoimprinting technique more accessible and reliable, fostering its adoption both in research and applications for everyday life.

Credit: 
Tokyo University of Science

How ancient dust from the sea floor helps to explain climate history

image: 18 sediment cores from the seabed were brought on board the research vessel "Polarstern" by means of plungers and gravity sounders.

Image: 
Katharina Pahnke / University of Oldenburg

During the last Ice Age about 20,000 years ago, iron-containing dust acted as a fertilizer for marine phytoplankton in the South Pacific, promoting CO2 sequestration and thus the glacial cooling of the Earth. But where did the dust come from? Researchers led by Dr. Torben Struve, geoscientist at the University of Oldenburg, Germany, have investigated this open question of climate history, which is also relevant with respect to current climate change.

Using sediment cores from the sea floor, they found that a large part of the dust deposited in the southern South Pacific at that time had travelled an extremely long way. Up to 80 percent of the dust came from what is now north-west Argentina, from where it was transported almost completely around the globe by the prevailing westerly winds. After a voyage of up to 20,000 kilometres, it contributed significantly to the increased input of iron into the glacial South Pacific. The dust input from Australia, which dominates in the South Pacific today, played only a minor role. The research team has published these new insights into the mechanisms of natural iron input into the Southern Ocean in the journal Nature Communications.

"We have analysed the chemical fingerprint of the dust and compared it with geological data from several continents. This was laborious work, like a jigsaw puzzle," says Struve, a post-doctoral scientist in the research group "Marine Isotope Geochemistry" at the University's Institute for Chemistry and Biology of the Marine Environment (ICBM). The team included researchers from his group as well as colleagues from the Alfred Wegener Institute - Helmholtz Centre for Polar and Marine Research, Bremerhaven (Germany), and from Columbia University, New York (USA).

The researchers sampled 18 sediment cores from the South Pacific between Antarctica, New Zealand and Chile, a study area which is roughly the size of Russia. Subsequently, they investigated the chemical composition of the dust contained in the samples. "This dust ultimately stems from rock, which has characteristic properties depending on its place of origin and geologic history so that each source has its own signature," Struve explains.

The researchers focused on trace metals, in particular rare earth elements and specific isotopes, that is variants of different weight, of the elements neodymium, lead and strontium. This signature is preserved over millions of years and thus provides reliable information about the origin of rock particles even after 20,000 years.

At that time, the last Ice Age was at its peak. According to the results, westerly winds blew dust particles from the eastern side of the central Andes in South America across the Atlantic and the Indian Ocean. In this way, the iron-bearing dust was transported once around the globe before being deposited in the middle latitudes of the South Pacific. Since algae in these waters usually lack iron as a crucial nutrient for growth, iron-containing dust still acts as a natural fertiliser today.

Like all plants, phytoplankton - microscopic algae - absorbs carbon by means of photosynthesis and thus reduces the proportion of carbon dioxide (CO2) in the atmosphere. According to Struve, the greatly increased input of iron-bearing mineral dust into this marine region, primarily from South America, could help to explain "how the Earth could have become so cold at all at that time".

It was already known that the iron input during the last ice age was much higher than during the present warm period. "But we were surprised to find that the sources and transport routes of the dust were completely different from today and also different from what we would have expected."

The research team concludes that the unusually high dust emissions from South America must have made a significant contribution to the reduction of CO2 in the atmosphere of the Ice Age. The input of iron-bearing mineral dust reduced the CO2 level of the atmosphere by up to 40 ppm (parts per million). This corresponds to almost half of the natural CO2 variation in the atmosphere over the last 400,000 years, which amounts to about 100 ppm. To put this into perspective: since the beginning of industrialisation, anthropogenic emissions have increased the CO2 level from around 280 to around 415 ppm.

Today, no dust from South America can be detected in the study area. "Global warming has changed the winds and environmental conditions in the source regions," says Struve, who continues to study the sediment cores. Together with his colleagues, he wants to find out how the composition of the dust has changed since the peak of the Ice Age and how this may have contributed to climate change.

Credit: 
University of Oldenburg

New method developed by Lithuanian scientists can reach 90% accuracy in detecting melanoma

image: The team lead by Prof Raisutis developed a method, which combines diagnostic information obtained from different non-invasive imaging technologies

Image: 
KTU

A team of researchers from Kaunas University of Technology and Lithuanian University of Health Sciences proposed a non-invasive method for detection of melanoma. A patented computer-aided diagnostic system developed by Lithuanian scientists proved to be more than 90% accurate in detecting malignancy in diagnostic images of skin lesions acquired from 100 patients.

In Europe, melanoma is the fifth most common type of cancer and the major cause of death from skin cancer. Northern Europe has the region's highest age-standardised mortality rate, 3.8 per 100,000, with an incidence of 23.4 per 100,000.

Excision of the primary tumour remains essential in diagnosing melanoma, and the decision to operate is generally based on the dermatoscopic evaluation of the lesion. However, the accuracy of clinical melanoma diagnosis is only about 65% and strongly depends on the experience of the dermatologist carrying out the analysis.

"The novelty of our method is that it combines diagnostic information obtained from different non-invasive imaging technologies such as optical spectrophotometry and ultrasound. Based on the results of our research, we can confirm that the developed automated system can complement the non-invasive diagnostic methods currently applied in the medical practice by efficiently differentiating melanoma from a melanocytic mole", says Professor Renaldas Raisutis, head of the team behind the research at Kaunas University of Technology.

In the study, carried out by the researchers of two Lithuanian universities, diagnostic images of skin lesions acquired from 100 different patients were evaluated. By comparing and analysing complex data on skin tumours recorded by different techniques, the researchers were able to develop a novel diagnostic system, differentiating melanoma from a benign nevus with accuracy higher than 90%.
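The actual features and classifier used by the patented system are not described in this release, but the general idea of combining two imaging modalities can be illustrated with a minimal, hypothetical sketch: concatenate each lesion's spectrophotometric and ultrasound feature vectors into one representation ("early fusion") and classify the combined vector, here with a toy nearest-centroid rule on made-up numbers:

```python
import numpy as np

def fuse_features(spectro, ultra):
    """Early fusion: concatenate per-lesion feature vectors from the
    two modalities (spectrophotometry and ultrasound) side by side."""
    return np.hstack([spectro, ultra])

def nearest_centroid_predict(train_x, train_y, test_x):
    """Classify each test lesion by the closer class centroid
    (1 = melanoma, 0 = benign nevus)."""
    c0 = train_x[train_y == 0].mean(axis=0)
    c1 = train_x[train_y == 1].mean(axis=0)
    d0 = np.linalg.norm(test_x - c0, axis=1)
    d1 = np.linalg.norm(test_x - c1, axis=1)
    return (d1 < d0).astype(int)

# Made-up 2-feature spectrophotometric and 1-feature ultrasound data
# for four training lesions (two benign, two malignant):
train_spec = np.array([[0.0, 0.0], [0.1, 0.0], [2.0, 2.0], [2.1, 2.0]])
train_ultra = np.array([[0.0], [0.1], [3.0], [3.1]])
train_y = np.array([0, 0, 1, 1])

fused = fuse_features(train_spec, train_ultra)
test = fuse_features(np.array([[2.0, 2.0]]), np.array([[3.0]]))
print(nearest_centroid_predict(fused, train_y, test))  # -> [1]
```

A real system would use clinically validated features and a far stronger classifier, but the fusion step itself, pooling complementary measurements before classification, is what lets one modality compensate for the other's blind spots.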

"An efficient diagnosis of an early-stage malignant skin tumour could save critical time: more patients could be examined and more of them could be saved", says Prof Raisutis.

According to Prof Raisutis, the novel diagnostic system is aimed first at medical professionals; he estimates that the price of the final product could be affordable even for smaller medical institutions. However, the team is also considering solutions for individual use at home.

Following the research, a prototype of the technology was developed, and clinical research is being carried out in compliance with a protocol approved by the regional bioethics committee.

The invention was patented in Lithuania (Patent No. LT6670B), and patent applications have been filed under the Patent Cooperation Treaty and with the United States Patent and Trademark Office.

The above-described method and technology were developed within the framework of the project "SkinImageFusion: Complex Analysis Method of Spectrophotometric and Ultrasound Data for the Automatic Differential Diagnosis of Early Stage Malignant Skin Tumours", funded by Lithuanian Research Council and carried out by a joint team of Kaunas University of Technology and Lithuanian University of Health Sciences (the latter headed by Professor Skaidra Valiukeviciene) in 2017-2020. Three doctoral dissertations were defended on the basis of this research.

Credit: 
Kaunas University of Technology

Plant inspired: Printing self-folding paper structures for future mechatronics

image: The cartridges of an inkjet printer were filled with aqueous LiCl solutions at different concentrations. Applying these solutions to paper causes it to spontaneously fold due to the relaxation and reorganization of cellulose fibers. Different concentrations result in different folding times and a changed folding start.

Image: 
Hiroki Shigemune in "Programming Stepwise Motility into a Sheet of Paper Using Inkjet Printing" published in Advanced Intelligent Systems of Wiley Online Library, under Creative Commons license CC BY 4.0

When natural motion comes to mind, plants are most likely at the bottom of most people's list. The truth is that plants can perform complex movements, but they only do so very slowly. The main mechanism behind plant movement is water absorption and release; the cellulose present in plant tissues draws water in and expands, and the underlying arrangement of cellulose fibers guides the motions as needed. Now, what if we drew ideas from this natural phenomenon and used them for future engineering applications?

Surprisingly, it turns out that this type of motion could become the basis for new types of robots and mechatronic devices. In a recent study published in Advanced Intelligent Systems, a team of scientists from Shibaura Institute of Technology (SIT) and Waseda University, Japan, developed a remarkably simple method based on this nature-derived concept to make paper fold itself as desired using nothing but a standard inkjet printer. Dr. Hiroki Shigemune from SIT, lead scientist on the study, explains their motivation: "Printing technologies to produce objects rapidly are currently in the spotlight, such as 3D printing. However, printing functional mechatronic devices remains a huge challenge; we tackled this by finding a convenient method to print self-folding paper structures. Since paper is mostly cellulose, we drew inspiration from plants."

The researchers first filled the ink cartridges of a regular printer with different aqueous solutions and printed simple straight lines across sheets of paper, causing them to fold by themselves (Figure 1). After settling on the best solute (LiCl, lithium chloride), they explored how different concentrations produced different folding behaviour. To this end, they filled the printer's four ink cartridges--for black, magenta, yellow, and cyan inks--with aqueous LiCl solutions at four different concentrations. Whereas pure water caused an almost immediate folding reaction, a high concentration of LiCl delayed the onset of folding slightly, and a lower concentration delayed the onset considerably and also slowed the folding itself.

With the self-folding mechanism in place, all the scientists had to do was give the printer a document to print containing lines of the corresponding colors depending on the desired folding sequence. This simple approach allowed them to produce various self-folding origami structures, including a traditional paper plane and a small paper ladder that positions, hangs, and then retracts itself. "Our technology can be used to very easily produce flexible, stretchable, deployable, and crushable origami structures by simply designing an appropriate printing pattern," remarks Dr. Shingo Maeda from SIT.

Now, the research team will focus on using this novel self-folding method to develop mechatronic devices. To do this, they will combine this approach with a previous technique they themselves had developed; a way to print electrical wiring onto paper also using a standard printer. "By merging these two technologies, we will realize a rapid yet simple fabrication procedure for mechatronic elements and paper robots," says Dr. Shuji Hashimoto from Waseda University, "It would find applications in the space, healthcare, and agriculture fields, where tailor-made and disposable intelligent devices are needed."

It is also worth noting that the self-folding process requires no external energy source or complex machinery of any kind, making it an attractive, environment-friendly option for realizing truly plant-inspired motion. Dr. Hideyuki Sawada from Waseda University remarks that the team also succeeded in achieving 'posteriori phototropism' in a silicone rod. In other words, they managed to recreate the natural movement of some plants, such as sunflowers, which selectively face towards the sun or other light sources. "Using this technique, we could design more efficient solar panels by ensuring they face the sun at all times," he explains.

This study showcases how we can draw inspiration from nature to develop revolutionary technology, even with commonplace devices such as an inkjet printer!

Credit: 
Shibaura Institute of Technology

The use of videos in education could improve student pass rates

The use of digital videos in learning processes is becoming increasingly widespread, particularly within the educational context created by the COVID-19 pandemic. In a situation where moving education online is the only recourse, carrying out research into the impacts of using this kind of audiovisual material becomes imperative. A UOC team has studied the way digital videos are perceived by groups of engineering students when applied to various introductory physics courses, both at a fully online university, such as the UOC, and a traditional bricks-and-mortar university, such as the Escola Universitària Salesiana de Sarrià (EUSS). The results show that audiovisual materials are well received by students, who find them extremely useful, and highlight videos' potential for increasing academic success by generating higher pass rates.

The study, published in the journal Multimedia Tools and Applications, was conducted by Antoni Pérez-Navarro, Jordi Conesa and Víctor García. Pérez-Navarro and Conesa are both professors at the UOC's Faculty of Computer Science, Multimedia and Telecommunications. The former is a member of the Internet Interdisciplinary Institute's (IN3) Internet Computing & Systems Optimization (ICSO) research group while the latter is a member of SmartLearn. García is a student on the doctoral programme in Education and ICT (E-learning).

As Pérez-Navarro explained: "Video creation is a time-consuming activity for teaching staff, few of whom are video production experts. It takes hours of work to produce just a few minutes of video, which is why it's so important to be able to gain an understanding of the impact generated by this kind of resource, as well as to gauge its reception and identify the types of video students find most useful."

Video use enhances student satisfaction ratings

After analysing the course feedback provided by 125 students in relation to the videos they had watched during the course and examining the correlation between the use of these materials and academic results, the researchers found that "the overwhelming conclusion was that the students liked the videos and felt they were useful for conveying both theoretical content and explanations to problems. That was true regardless of whether the learning environment was online or on-site."

The study also found that the inclusion of videos within courses had a positive impact on academic results, increasing the likelihood of students obtaining a pass. Pérez-Navarro, who was responsible for producing the hundred videos used during the experiment, pointed out: "The use of videos during the course led to better course progression as well as a higher pass rate. The students also found the videos useful in helping to assimilate the knowledge which in turn increased their level of satisfaction with regard to the course."

In search of the ideal format

Participants were given a variety of different video formats to watch throughout the course, most of which were filmed using applications to capture digital tablet notation or which showed the hands of the professor as they talked the students through the relevant lesson or problem. The students were also given videos of the professors themselves giving a lecture. According to the results of the new study, students prefer videos which feature their professor as a visual reference, either with a head and shoulders shot or showing their hands accompanied by a voiceover. "The priority for students appears to be being able to access video content, with the format of lesser concern, although more than half of the students surveyed expressed a preference for the hand/narration model," explained Antoni Pérez-Navarro, going on to emphasize the importance of taking practical considerations into account in relation to the creative process when choosing the best format to use. "When I started out it was taking me eight or nine hours to make a 10-minute video, which is obviously not feasible. Our goal, therefore, is not only to determine which types of video are the most useful to students but also which are easiest for teachers to create."

Thus, after experimenting with different formats, the researchers opted to record their hands while explaining the subject matter, as this directs the viewer's gaze without the need to annotate the text with arrows or other graphic elements. According to the expert: "When we see a hand pointing to something we usually look in that direction; that's their magic power. These videos are also extremely easy to make and are less likely to prompt an emotional response than other alternatives. When we put a professor on the screen, for example, we realized that it was more difficult to get the lighting right and that there were potential emotional implications: someone may respond to that person in a negative way or take exception to the way they look or their body language."

Short, well-planned videos

Video has served as a key resource during the lockdown that forced on-site classes to move into the digital realm, albeit, for the most part, without a great deal of forethought. According to the researcher, many professors found themselves giving "online classes without having had any prior experience, sometimes lacking the necessary resources, and with a tendency to opt for recording their lectures directly. This has resulted in excessively long videos where there is no plan in terms of the best form of delivery for this product."

"One very common mistake is to think, and I'm quoting what someone actually said, 'if they can cope with listening to me talking for two hours in class, they can cope with watching a two-hour video'. The reality is that this is a different environment and medium and, in general, short videos with a running time of a few minutes have greater impact. You have to think that the average feature film is around two hours long and that involves professional screenwriters, professional actors, with music, special effects, etc. And even then, sometimes we lose interest along the way."

Planning is the key to condensing all the relevant information into a video, with the content of each video carefully thought out and care taken over the technical aspects. "Lighting is very important, but sound is key. You have to be able to hear everything perfectly, otherwise it will be very difficult to follow."

Another important element when it comes to approaching a project like this is knowing yourself, knowing your own strengths and limitations and, above all, practising. "The first videos you make are usually awful. I remember when I started making this type of resource back in 2007, it was taking me four to five hours to make just two minutes of video. It's a learning curve for everyone, but if we are familiar with the format, the materials or the environment being used, that curve will be gentler, and the gentler the curve, the more quality videos you will be able to make," advised the researcher.

According to Pérez-Navarro, it's not about the technology: "You can deliver 19th-century classes with 21st-century technology." It is more about empathising with your students, making sure you take their needs and opinions into account: "You have to listen to the students; they are the best critics. If we ignore them and they don't watch the videos, then all our hard work has been for nothing."

Credit: 
Universitat Oberta de Catalunya (UOC)

One third of UK fruit and vegetables are imported from climate-vulnerable countries

The UK's supply of fruit and vegetables has become increasingly reliant on imports from countries vulnerable to climate change, according to a new study in Nature Food.

The research, led by the London School of Hygiene & Tropical Medicine (LSHTM), involved analysing open-source data on food trade from 1987 to 2013. The researchers estimated that the domestic contribution to total fruit and vegetable supply in the UK decreased from 42% in 1987 to 22% in 2013, while over the same period imports of fruit and vegetables from climate-vulnerable countries increased from 20% to 32%.

The team found that the variety of fruits and vegetables imported into the UK has increased, and that there have been major shifts in the types of fruits and vegetables supplied to the UK market: tropical fruits have become more popular, but the supply of traditional vegetables has significantly declined.

In 1987, 21 crops comprised the top 80% of total fruit and vegetables supplied to the UK, and this rose to 27 in 2000 and 34 in 2013. The supply of pineapples increased from 0.9% to 1.4% of overall fruit and vegetable supply, and bananas from 3% to 7.8%, over this period.

Cabbages declined from 7.5% in 1987 to 2.5% of overall fruit and vegetable supply, peas from 5.0% to 1.3% and carrots from 7.0% to 5.8%.
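The share figures quoted above are simple percentage-of-total calculations over supply data. A minimal sketch in Python (the categories and unit values here are illustrative placeholders, not the study's actual FAO trade data):

```python
def share(part, total):
    """Return `part` as a percentage of `total` supply."""
    return 100.0 * part / total

# Illustrative breakdown of total UK fruit and vegetable supply by
# origin (arbitrary units; the numbers mirror the 2013 percentages
# quoted in the article, not the underlying dataset).
supply_2013 = {
    "domestic": 22.0,
    "climate_vulnerable_imports": 32.0,
    "other_imports": 46.0,
}

total = sum(supply_2013.values())
for origin, amount in supply_2013.items():
    print(f"{origin}: {share(amount, total):.1f}% of supply")
```

Running the same computation on two reporting years (e.g. 1987 vs 2013) gives the trend the researchers describe: a falling domestic share alongside a rising share from climate-vulnerable exporters.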

Given the projected trends in global climate change, the researchers say that increased reliance on fruit and vegetable imports from climate-vulnerable countries could have a negative impact on the availability, price and consumption of fruit and vegetables in the UK. The researchers used a range of indices to assess the vulnerability of countries to current and future climate change. The analysis suggests that the most affected groups are likely to be people in low-income households.

Fruit and vegetables are key components of healthy diets, but globally their consumption is well below current international dietary recommendations. Just 30% of adults and 18% of children eat the recommended five portions of fruit and/or vegetables per day in England. Fruit and vegetables also typically have lower environmental footprints than animal-sourced foods, and this dual contribution to health and sustainability is becoming increasingly recognised.

Dr Pauline Scheelbeek from LSHTM's Centre on Climate Change & Planetary Health, who led the study, said: "The UK's current trade patterns and climate change mean the supply of fruit and vegetables in the UK is not secure. The recognition that trade is a key component of food system resilience is therefore vital information for policymakers.

"The increased reliance on fruit and vegetable imports from climate-vulnerable countries will, if no adequate climate change adaptation measures are taken in the production countries, lead to fruit and vegetable supply problems in the UK and potentially affect price and consumption of such foods. This could be a major challenge in our efforts to promote higher fruit and vegetable consumption in the UK, both for health and environmental reasons."

Professor Alan Dangour, Director of the Centre on Climate Change & Planetary Health at LSHTM, said: "It is very clear from the underlying trends in food production and trade that the UK is increasingly reliant on climate-vulnerable countries for its supplies of fruit and vegetables. The government cannot ignore these trends or it will be failing in its primary duty to protect its people from future shocks. I call on the government to do more now to support national food production, build resilience into the national food system and ensure the supply of healthy and sustainable diets for all."

The research team say that the results are particularly important in the light of several government-led programmes, such as the UK's National Food Strategy, the Nationally Determined Contributions of the UK, and the Obesity Strategy, as well as ongoing Brexit trade negotiations.

Dr Pauline Scheelbeek said: "The implications of our trade strategy's vulnerability cut across traditional policy silos such as diets, health, agriculture, economy and the environment. We need to rethink our trade strategy to reduce dependency on climate-vulnerable countries, import responsibly and look into possibilities to enhance consumption of sustainably grown fruit and vegetables, including those produced in the UK."

The study is subject to some limitations. The openly available trade data relies on reporting from individual countries, which may vary in quality. The indices used to determine climate vulnerability are modelled estimates and determined at country level: the vulnerability of the specific locations of crop production may not be the same as the country average.

Credit: 
London School of Hygiene & Tropical Medicine

A more resistant material against microorganisms is created to restore cultural heritage

image: The study was performed by a research team at the University Research Institute into Fine Chemistry and Nanochemistry at the University of Cordoba and Seville's Institute of Natural Resources and Agrobiology of the Spanish National Research Council

Image: 
University of Cordoba

Solar radiation, rain, humidity and extreme temperatures: cultural heritage is exposed to an array of external factors that deteriorate it over time. Among them, the most aggressive may well be microbial contamination, caused by an ample ecosystem of fungi, algae, bacteria and microscopic lichens that grow inside the pores of the materials the buildings are made of, making the buildings less resistant to other external agents and speeding up the deterioration process over time.

When restoring historical monuments, it is important to use tough materials that can withstand these microorganisms. This task is complex, given that the materials used in these kinds of restorations must be compatible with the original materials, made of plaster, lime mortar and stones such as limestone or marble. Cement and concrete, materials commonly used in modern construction, are ruled out as they are incompatible with materials such as lime mortar and could even worsen the problem.

A research team from the University Research Institute into Fine Chemistry and Nanochemistry at the University of Cordoba (the FQM 214 and FQM 175 groups) and Seville's Institute of Natural Resources and Agrobiology of the Spanish National Research Council (abbreviated to IRNAS-CSIC in Spanish) worked together to create a biocide additive (that is, one that kills microorganisms) that can be incorporated into materials used to rebuild historic monuments and buildings.

"The materials that contain these kinds of chemical compounds are widely used in restoration, but their effectiveness usually lasts for a brief amount of time (about two years), since the external agents, in addition to deteriorating the material, end up weakening its biocidal properties," explains Adrián Pastor, one of the researchers on the study, which is part of his doctoral research for his thesis titled "New functional materials to decontaminate cultural heritage and urban habitats". The study was performed under the guidance of Dr. Luis Sánchez and Dr. Ivana Pavlovic and with the participation of Dr. Manuel Cruz Yusta and Dr. Beatriz Gámiz (RNM 124).

In this research, the team tested hydraulic lime mortar to which they added carbendazim, a biocidal compound widely used in paint, as it has low water solubility and is therefore more water resistant. They then compared, on the one hand, the antimicrobial effectiveness of a lime mortar to which carbendazim was added directly and, on the other, a lime mortar containing a clay to which the biocidal compound was anchored.

Both underwent several microbiological tests in order to test their ability to fight microorganisms and a leaching process, in which the soluble parts of a material are removed, simulating various rain cycles in a short amount of time.

"In the first microbiological test, we verified that the first mortar, to which we added carbendazim directly, had a somewhat greater biocidal capacity. However, after the leaching processes, we verified that the second mortar, with carbendazim anchored to the clay, showed better results, since the biocide compound was released more slowly and its effect was therefore longer-lasting," explains Adrián Pastor.

This is a preliminary study, and further research is needed to bring the material to market: a larger-scale study, as well as characterization of the material's specific physical properties, in order to verify that it complies with regulations regarding durability, adhesion and other properties.

Credit: 
University of Córdoba

RUDN University soil scientist: Deforestation affects the bacterial composition of the soil

image: A soil scientist from RUDN University studied the effect of forest conversion on the properties of the soil: its acidity, carbon and nitrogen resources, bacterial composition, and the activity of microorganisms. The study can help improve the methods of soil cultivation after deforestation, namely, select the best fertilizers, prevent erosion, slow down nutrient depletion, and balance the composition of the bacterial community.

Image: 
RUDN University

A soil scientist from RUDN University studied the effect of forest conversion on the properties of the soil: its acidity, carbon and nitrogen resources, bacterial composition, and the activity of microorganisms. The study can help improve the methods of soil cultivation after deforestation, namely, select the best fertilizers, prevent erosion, slow down nutrient depletion, and balance the composition of the bacterial community. The results of the study were published in the Forest Ecology and Management journal.

The demand for crop farming products grows constantly, and to satisfy it, more and more forests are converted into plantations. In these converted areas sustainable and diverse ecosystems are replaced with monocultures (crop species). Such changes in land utilization affect both the chemical content of the soil and its biological composition, that is, the structure of its microbial community. Until recently, studies had focused on either the former or the latter aspect of this process. A soil scientist from RUDN University was the first to conduct a comprehensive study and to find out how deforestation and changes in chemical factors caused by it affect the bacterial composition of the soil.

"The diversity of soil microorganisms doesn't necessarily reduce as a result of forest conversion. However, bacterial communities undergo massive transformations. The bacteria that used to dominate in forest soils can almost disappear after deforestation and planting of crops. The key factors in this process are soil acidity and carbon and nitrogen resources," says Yakov Kuzyakov, a Ph.D. in Biology, and the Head of the Center for Mathematical Modeling and Design of Sustainable Ecosystems at RUDN University.

His team compared soil samples taken from a forest and four plantations in Hunan Province in South-Eastern China. Five years earlier, the whole territory had been covered with a pristine forest. The scientists measured the acidity of the soil, as well as the levels of carbon and nitrogen in it. All these indicators are associated both with soil fertility and with bacterial activity. Microorganisms play a role in the circulation of soil carbon and also 'fix' nitrogen, making it accessible to plants. It turned out that soil acidity decreases after deforestation, and the levels of organic nitrogen and carbon drop by 83%. According to the team, this may be due to the reduction of vegetative cover and to soil erosion. However, to the team's surprise, bacterial diversity in plantation soils turned out to be 6.8% higher than in forest soils.
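The article does not say which diversity metric the team used; the Shannon index is a common choice for soil bacterial communities. A minimal sketch with hypothetical taxon counts (not the study's data):

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln(p_i)) over taxa with nonzero counts."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Hypothetical abundance counts per bacterial taxon in two samples.
forest = [500, 300, 150, 50]          # few taxa, strongly dominated by one
plantation = [300, 250, 200, 150, 100]  # more taxa, more even abundances

h_forest = shannon_index(forest)
h_plantation = shannon_index(plantation)
print(f"forest H' = {h_forest:.2f}, plantation H' = {h_plantation:.2f}")
```

A more even community with more taxa yields a higher index, so in this toy example the plantation sample scores higher, mirroring the direction of the study's finding.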

The scientists believe this might be due to the fertilization of plantations. Fertilizers contain a lot of nutrients, thus increasing microbial diversity. Moreover, cultivated soil is enriched in carbon and other substances that also support intensive bacterial growth. Reduced acidity might be another factor to promote microbial and especially bacterial diversity. Different bacteria turned out to dominate in forest and plantation soils. For example, deforestation created perfect conditions for photosynthesizing bacteria. They transform sunlight into energy, which is a much more difficult task in shady forests.

"We found out that changes in the bacterial composition of the soil are mainly due to soil acidity and the levels of organic carbon and nitrogen. Therefore, efficient soil management methods should be developed for monoculture plantations to improve fertilization, prevent soil erosion, slow down the depletion of nutrients, and support microbial activity after deforestation," added Yakov Kuzyakov.

Credit: 
RUDN University

Liver scarring relatively common among middle-aged adults

(Boston) -- A substantial minority of participants from the Framingham Heart Study (nearly nine percent) had potentially clinically significant liver fibrosis (scarring). This is the first study of this size and scale conducted in the United States.

"Before this study, we did not know how common asymptomatic liver fibrosis (scarring) was among adults living in the community," said corresponding author Michelle T. Long, MD, MSc, assistant professor of medicine at Boston University School of Medicine (BUSM).

More than 3,000 middle-aged Framingham Heart Study participants underwent, over a three-year period, a test called Fibroscan©, or vibration-controlled transient elastography, which quantifies how much fat is in the liver and also measures the stiffness of the liver. Liver stiffness correlates with the degree of liver scarring.

"We found that liver fibrosis was associated with more adverse cardiometabolic risk factors, even after accounting for liver fat which is a known risk factor for cardiometabolic disease. In particular, we observed that approximately one-quarter of the participants with diabetes had evidence of possibly clinically significant liver fibrosis," explained Long, who also is a hepatologist at Boston Medical Center.

According to the researchers, these findings support the consideration of screening for liver fibrosis in high-risk groups, though additional studies are needed to determine the benefits/costs of screening. "Liver biopsy is the gold standard for diagnosing liver fibrosis; however, new non-invasive tests exist that can quickly and painlessly help doctors determine if you are at risk for having clinically significant liver fibrosis."

If liver fibrosis is identified early, before cirrhosis is established, it is treatable. Long believes greater recognition of and awareness of liver fibrosis as a consequence of nonalcoholic fatty liver disease will hopefully allow more patients to receive treatment to prevent complications of advanced liver disease.

Credit: 
Boston University School of Medicine

Distinct slab interfaces found within mantle transition zone

image: Seismic observations (a) and a conceptual cartoon summarizing the origin of imaged slab interfaces (b)

Image: 
CHEN Qifu's group

Oceanic lithosphere descends into Earth's mantle as subducting slabs. Boundaries between the subducting slab and the surrounding mantle are defined as slab interfaces, whose seismic imaging is the key to understanding slab dynamics in the mantle. However, the existence of slab interfaces below 200 km remains elusive.

Prof. CHEN Qifu's group from the Institute of Geology and Geophysics, Chinese Academy of Sciences (IGGCAS) and their collaborators observed two distinct seismic discontinuities within the mantle transition zone (~410 km to 660 km) beneath the western Pacific.

The two discontinuities represent the upper and lower boundaries of the subducted Pacific high-velocity slab, corresponding to the slab Moho and the surface of a partially molten sub-slab asthenosphere, respectively.

This work was published in Nature Geoscience on Nov. 9.

The subduction process transports chemically differentiated and hydrated rocks into Earth's mantle, driving the cycles of heat and material changes between Earth's surface and its deep interior.

Slab interfaces can be seismologically detected at shallow depths. However, how deep the seismic velocity discontinuities at slab interfaces extend remains unclear, mainly due to the lack of high-resolution imaging of slab interfaces at depths below 200 km.

To understand the existence and origin of deep slab interfaces, the researchers took advantage of the dense seismic arrays in northeast China to study the upper mantle structures in the region.

They found sharp-dipping, double seismic velocity discontinuities within the mantle transition zone (~410 km to 660 km) beneath the western Pacific that coincide spatially with the upper and lower bounds of the high-velocity slab.

"Based on detailed seismological analyses, the upper discontinuity was interpreted to be the Moho discontinuity of the subducted slab," said Prof. CHEN. "The lower discontinuity is likely caused by partial melting of the sub-slab asthenosphere under hydrous conditions in the seaward portion of the slab."

The imaged distinct slab-mantle boundaries at depths between 410 and 660 km, deeper than previously observed, suggest a compositionally layered slab and a high water content beneath the slab.

Credit: 
Chinese Academy of Sciences Headquarters

Electrified magnets: researchers uncover a new way to handle data

The properties of synthesised magnets can be changed and controlled by charge currents as suggested by a study and simulations conducted by physicists at Martin Luther University Halle-Wittenberg (MLU) and Central South University in China. In the journal "Nature Communications", the team reports on how magnets and magnetic signals can be coupled more effectively and steered by electric fields. This could result in new, environmentally friendly concepts for efficient communication and data processing.

Magnets are used to store large amounts of data. They can also be employed in transmitting and processing signals, for example in spintronic devices. External magnetic fields are used to modify the data or the signals. This approach has drawbacks, however. "Generating magnetic fields, for example with the help of a current-carrying coil, requires a lot of energy and is relatively slow," says Professor Jamal Berakdar from the Institute for Physics at MLU. Electric fields could help. "However, magnets react very weakly - if at all - to electrical fields, which is why it is so hard to control magnetically based data using electrical voltage," continues the researcher.

The team from Germany and China therefore looked for a new way to enhance the response of magnetism to electrical fields. "We wanted to find out whether stacked magnetic layers react fundamentally differently to electrical fields," explains Berakdar. The idea: the layers could serve as data channels for magnetically based signals. If a metal layer, for example platinum, is inserted between two magnetic layers, the current flowing in it attenuates the magnetic signal in one layer but amplifies it in the other. Through detailed analysis and simulations, the team was able to show that this mechanism can be precisely controlled by tuning the voltage that drives the current, allowing precise and efficient electrical control of the magnetic signals. In addition, the mechanism can be implemented on a nanoscale, making it interesting for nanoelectronic applications.

The researchers went one step further in their work. They were able to show that the newly designed structure also responds more strongly to light or, more generally, to electromagnetic waves. This is important if electromagnetic waves are to be guided through magnetic layers or if these waves are to be used to control magnetic signals. "Another feature of our new concept is that this mechanism works for many material classes, as simulations under realistic conditions show," says Berakdar. The findings could thus help to develop energy-saving and efficient solutions for data transmission and processing.

Credit: 
Martin-Luther-Universität Halle-Wittenberg

A biomimetic membrane for desalinating seawater on an industrial scale

image: Artificial water channels inserted into a polyamide membrane

Image: 
Mihail Barboiu, Institut Européen des Membranes (CNRS/ENSC Montpellier/University of Montpellier)

The treatment of seawater, including its large-scale desalination, is a major challenge for our society. Reverse osmosis[1] is one of the most widely used techniques for the desalination of water. Some of the membranes currently used are artificial water channels[2] inserted into lipid layers. But their large-scale performance is not satisfactory under real osmotic pressure and salinity conditions. An international team, involving researchers from KAUST (Saudi Arabia) and Politecnico di Torino (Italy) and coordinated by scientists from the Institut Européen des Membranes (CNRS/ENSC Montpellier/University of Montpellier), has developed a hybrid strategy, which consists of combining a polyamide matrix and artificial water channels into a single structure. Their membranes, which take the form of a sponge superstructure, have been tested under industrial conditions and outperform conventional membranes. Their flow is 75% higher than that observed with current industrial membranes, and they require about 12% less energy for desalination. This work is patented[3] and was published on November 9, 2020 in Nature Nanotechnology.

Credit: 
CNRS

Princeton researchers find key to piercing harmful bacteria's armor

image: In Gram-negative bacteria, LPS and phospholipids are manufactured at the inner bacterial membrane and must be delivered across the cell wall to the outer membrane. The manufacture and delivery of LPS to the outer bacterial membrane is carefully balanced against phospholipid levels because imbalances can be lethal to the cell. Princeton University researchers have identified a new bacterial protein that assists in delivering components to the outer membrane of the Gram-negative bacterium Escherichia coli, as they report in recent papers in PNAS and Trends in Microbiology.

Image: 
Silhavy Lab, Princeton University

Bacteria are single-celled organisms that are essential to human health, both in our environment and inside our own bodies. However, certain bacterial species can make us sick.

When a physician suspects an illness of bacterial origin, they will perform diagnostic tests to identify what bacterial species is causing disease so that a course of treatment can be devised. One of these tests is called the Gram stain, after Hans Christian Gram, who developed the technique in the 1880s. Gram discovered that certain bacterial species, the so-called "Gram-negative" bacteria, shrug off a purple dye he was using to help visualize the microbes under his microscope. Scientists eventually discovered that Gram-negative bacteria resist dye uptake because they are enveloped in what is, essentially, a microbial suit of armor: their vulnerable cell membrane is protected by a layer of tightly packed sugars called the cell wall, and on top of that, a specialized outer membrane.

"Understanding how bacteria build this barrier is an important step in engineering strategies to circumvent it," said Thomas Silhavy, the Warner-Lambert Parke-Davis Professor of Molecular Biology, and the senior author on two new papers investigating the outer membrane, one in the journal Proceedings of the National Academy of Sciences and the other in the journal Trends in Microbiology.

One of the main components of the outer membrane is a unique molecule called lipopolysaccharide (LPS), which covers the surface of the cell. "LPS helps to increase the mechanical strength of the Gram-negative cell envelope and it also forms a surface coating that prevents toxic molecules, including certain antibiotics, from entering the cell," said Randi Guest, a postdoctoral research associate in the Silhavy lab, a lecturer in molecular biology, and the lead author of the Trends article.

LPS is a famously potent toxin that can cause severe illness when it is released from the bacterial outer membrane or secreted by the cell.

"The amount of LPS produced by the cell is carefully controlled, as too little LPS may lead to cell rupture, while too much LPS, especially if not properly assembled, is toxic," said Guest. "We reviewed studies of three essential membrane proteins that monitor not only LPS biosynthesis inside the cell, but also transport to, and proper assembly at the cell surface."

As Guest and colleagues discuss in their article, the construction of the bacterial outer membrane represents a complex problem for bacteria because potentially dangerous LPS, made inside the cell, must be transported across the cell wall to reach the outer membrane. In addition, these processes must be balanced against the manufacture and transport of the other components of the membrane, which in Gram-negative bacteria is mainly made up of a class of molecules called phospholipids.

"One long-standing mystery in the field is how phospholipids are transported to the outer membrane," said Silhavy. One idea is that phospholipids can flow passively back and forth between the bacterium's inner cell membrane and its outer membrane at zones of contact, but this idea is highly controversial. New research from Silhavy's group provides support for the idea that a passive mode of transport does exist.

Jackie Grimm, then a graduate student in Silhavy's lab, together with Handuo Shi, a graduate student in KC Huang's laboratory at Stanford, led an effort to identify proteins involved in trafficking phospholipids between the inner and outer membranes. For their studies, the colleagues used bacteria that have a mutation that increases the rate at which phospholipids flow from the inner membrane to the outer membrane. When they are deprived of nutrients, these bacteria experience shrinkage and rupture of the inner membrane, followed by cell death, because they are unable to make new phospholipids for the inner membrane to replace those lost to the outer membrane. The authors introduced additional mutations into these bacteria, then looked for genes which, when mutated, affect how quickly the bacteria die after nutrient withdrawal.

"We used next-generation sequencing to screen for genes involved in this process and found that disruption of the gene yhdP slowed phospholipid transport," said Silhavy.

Although their data indicate that the protein encoded by yhdP is involved in phospholipid transport between the inner cell membrane and the outer membrane, Grimm, Shi and their colleagues noted that it's not yet clear how YhdP protein works to affect this process. A potential clue might be found in its predicted similarity to other proteins whose function is already known. One of these is a mammalian protein that forms a channel that transports phospholipids across membranes.

"This suggests that YhdP might form a hydrophobic channel between the inner and outer membrane through which phospholipids flow," noted Silhavy.

"Silhavy and colleagues provide the strongest data to date towards identifying how phospholipids are transported between membranes in bacteria, an elusive question for decades in our field," said M. Stephen Trent, Distinguished Professor of Infectious Diseases at the University of Georgia, who was not involved in the work. "They make a strong argument with genetics and biophysics that a protein of unknown function, YhdP, affects a rapid transport process for phospholipids between membranes. It will be really interesting to learn YhdP's role in phospholipid transport in the future."

Credit: 
Princeton University