Tech

Isoflavones in soybean help protect pigs against viral infections

image: Brooke Smith and Ryan Dilger, University of Illinois.

Image: 
College of ACES, University of Illinois.

URBANA, Ill. -- Pigs that eat soybean as a regular part of their diet may be better protected against viral pathogens, a new study from the University of Illinois shows. The researchers attribute the effect to isoflavones, a natural compound in soybeans.

Porcine reproductive and respiratory syndrome virus (PRRSV) is a widespread disease that costs U.S. swine producers around $650 million every year. There is evidence that feeding soy helps protect pigs against the disease, but it's not clear why or how it works, says Ryan Dilger, co-author on the study and associate professor in the Department of Animal Sciences, Division of Nutritional Sciences, and Neuroscience Program at U of I.

Dilger and his collaborators previously pointed to dietary soy isoflavones as the active ingredient, and they wanted to explore that hypothesis further.

"In this study, we're looking specifically at isoflavones and whether they have a beneficial effect on the immune response," Dilger says. "We wanted to understand how we can take a primary protein source in a diet that's already used for pigs and provide a practical way for producers to combat the endemic PRRSV."

Isoflavones are flavonoid compounds that occur naturally in plants, with particularly high concentrations in soybeans. They have well-known health benefits and are used as dietary supplements for humans, explains Brooke Smith, lead author of the study and graduate researcher in the Veterinary Medical Scholars Program at U of I.

"When they're included in the diet of infected pigs, these isoflavones seem to be supportive by either helping the pigs clear secondary infections or setting them up for a more successful immune response so they clear the infection and don't succumb to it," Smith says.

Dilger adds the research is unique in focusing on nutrition. "We are using something that's going through the gastrointestinal tract to try and alter the immune response to a virus which is actually in the lungs. So never does the virus come into direct contact with the isoflavones. These are two different systems," he says.

The study included 96 pigs, divided into three groups. Two groups were infected with PRRSV; one of these groups received a supplement of isoflavones while the other did not. A control group of non-infected pigs received a diet without isoflavones.

Infected pigs that did not consume isoflavones had an approximately 50% higher rate of infection-related mortality than those receiving the supplement. Consequently, isoflavones in the diet could have a significant economic effect for producers, the researchers conclude.

In a second part of the study, the researchers looked more specifically at whether isoflavones might benefit the immune system indirectly by changing the profiles of bacteria in the large intestine of the pig.

"We did not know whether there was a direct effect of isoflavones on the immune system or whether it was a result of isoflavones benefitting resident bacteria, which then had an indirect effect on the host," Smith says.

They were able to rule out the indirect effect of isoflavones through bacteria. However, even though the researchers were not able to explain the biological mechanisms, it is clear isoflavones are beneficial.

Soybean is usually a part of pig diets, and the researchers recommend producers keep it that way. They say more studies are needed to determine the ideal amount for optimal benefits.

While isoflavones have a wide variety of anti-inflammatory and anti-oxidative cell activities, they also have estrogen-like components that can affect breeding females and change the reproductive cycle. Swine producers need to balance anti-viral effects with estrogen activity when determining the isoflavone level in diets for gilts.

The researchers point out that their findings can also have implications for human health. The inspiration for their study came from research in humans that looked at antiviral properties of isoflavones.

"We've brought the human context into the pig, and we've put it in the scenario of production agriculture. We learned something that may benefit swine producers, but certainly it goes back in the other direction as well, to potentially help with human health," Dilger explains.

Credit: 
University of Illinois College of Agricultural, Consumer and Environmental Sciences

Chemical thermometers take temperature to the nanometric scale

image: Temperature map of a gold nanowire on a silicon substrate, Joule-heated by the application of an electrical current of 7 mA, obtained through infrared thermography (top) and a spin-crossover surface thermometer (bottom). While heating remains undetectable in infrared due to low thermal and spatial resolution, temperature distribution is well resolved using an SCO-based thermometer, which reveals a "hot spot" resulting from a malfunction of the component.

Image: 
Ridier et al.

The miniaturisation of electronic components coupled with their increasing integration density has considerably expanded heat flows, which can lead to overheating. But how to measure these nanometric events when conventional solutions such as infrared thermography cannot go below a micrometre (1,000 times bigger than a nanometre)?

A research team bringing together scientists from two CNRS laboratories, the Coordination Chemistry Laboratory and the Laboratory for Analysis and Architecture of Systems, has proposed doing so by using the bistability properties of a family of chemical compounds known as spin-crossover (SCO) molecules. These molecules exist in two electronic states with different physical properties, and can switch from one to the other when they absorb or lose energy. For instance, some of them change colour depending on the temperature.

Once deposited as a film on an electronic component, SCO molecules change their optical properties depending on the temperature, enabling this chemical thermometer to establish a nanometric-scale thermal map of the surface of microelectronic circuits. However, the primary feat of these SCO molecular films is actually their unique stability: the properties of the molecules remain unchanged, even after more than 10 million thermal cycles in ambient air and at high temperatures (up to 230°C).

This innovation* overcomes the primary hurdle for SCO molecules, namely their fatigability, or the fact that their properties are often altered after multiple transitions from one electronic state to another. It could soon be used in the microelectronics industry to probe local thermal processes, and to thereby improve the design of future devices.

Credit: 
CNRS

Orderly arranged bead-chain ternary nanocomposites for supercapacitors

image: A schematic diagram of the Cu2O-Mn3O4-NiO ternary nanocomposite preparation process. Compared with the traditional hydrothermal method, the materials prepared by electrospinning are nanostructured, which improves the electron transport capacity and the energy storage capacity of the metal oxide. The acquired Cu2O-Mn3O4-NiO ternary nanocomposites were arranged in orderly metallic nanostructures, which should be of interest for the development of supercapacitor electrode materials.

Image: 
Author

In a paper published in NANO, a group of researchers from Jiangsu University of Technology, China, has developed novel Cu2O-Mn3O4-NiO ternary nanocomposites by electrospinning, improving the performance of supercapacitor electrode materials.

Supercapacitors feature high power density and long cycle life, and they are of increasing significance as advanced energy storage devices. Nanomaterials and their composites are recognized as strong candidates for energy materials because their reduced dimensions, facile charge conduction, and surface-dominated behavior provide better interfaces and faster chemical reaction rates.

However, the preparation of electrode materials is a key factor affecting the performance of supercapacitors. Compared with other methods for fabricating nanofibers, electrospinning has attracted more and more attention because of its simple, single-step processing and cost-effectiveness. Electrospinning metal oxide fibers is a promising method for generating composite nanofibers with a high specific surface area, high crystallinity, and an increased number of active sites. The resultant nanofibers are ideal for energy storage applications because the nanofibrous surface morphology provides a path for electron transport, which improves the energy storage capacity of the metal oxide.

In this work, the obtained nanocomposites (Cu2O-Mn3O4-NiO) are an ordered, bead-chain-like arrangement of metal oxide particles (10 nm). The acquired Cu2O-Mn3O4-NiO ternary nanocomposites were used as electrode materials to manufacture a supercapacitor. Electrochemical tests showed that electrodes made from the synthesized nanocomposites had good electrochemical performance in 6 mol/L KOH electrolyte. At a scan rate of 5 mV/s, Cu2O-Mn3O4-NiO exhibited a specific capacitance of 1306 F/g, larger than that of NiO, Cu2O-NiO and Mn3O4-NiO. The ternary nanocomposite thus improved the electrochemical performance of the electrode materials and can be used for efficient supercapacitors.
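For readers unfamiliar with how figures such as 1306 F/g are obtained, the sketch below shows a common way of estimating gravimetric specific capacitance from a single cyclic voltammetry (CV) cycle. It is a generic illustration with hypothetical numbers, not the authors' measurement procedure.

```python
import numpy as np

def specific_capacitance_cv(voltage_V, current_A, time_s, mass_g):
    """Estimate gravimetric specific capacitance (F/g) from one full CV cycle.

    Uses the common formula C = Q / (2 * m * dV), where Q is the total charge
    magnitude passed over the cycle (both sweeps) and dV is the potential window.
    """
    v_window = voltage_V.max() - voltage_V.min()  # potential window dV (V)
    # Trapezoidal integration of |I| over time gives the total charge magnitude (C).
    q_total = np.sum(0.5 * (np.abs(current_A[1:]) + np.abs(current_A[:-1])) * np.diff(time_s))
    return q_total / (2.0 * mass_g * v_window)

# Hypothetical example: a 5 mV/s scan over a 0.5 V window takes 200 s per cycle.
# A constant |I| of 3.3 mA on a 1 mg electrode gives Q = 0.66 C and
# C = 0.66 / (2 * 0.001 * 0.5) = 660 F/g.
t = np.linspace(0.0, 200.0, 4001)
v = np.where(t <= 100.0, 0.005 * t, 0.5 - 0.005 * (t - 100.0))  # triangular sweep
i = np.full_like(t, 3.3e-3)
print(specific_capacitance_cv(v, i, t, 1e-3))  # ~660 F/g
```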

The electrospinning synthesis of Cu2O-Mn3O4-NiO nanocomposites is adaptable to large, industrial-scale production. Structural characterization and composition analysis explained the excellent behavior of Cu2O-Mn3O4-NiO. Owing to the chemical reactions, and hence the strong interaction, between the functional groups and the electrolyte ions, the Cu2O-Mn3O4-NiO nanocomposites exhibited outstanding electrochemical performance in terms of high specific capacitance and capacitance retention.

This work was supported by Postgraduate Research & Practice Innovation Program of Jiangsu Province (20820111964-SJCX19-0754) and (20820111950-SJCX19-0740), National Natural Science Foundation of China (grant no. 31800495), Natural Science Foundation of Jiangsu Province (grant no. BK20181040).

Credit: 
World Scientific

Reduction in commercial flights due to COVID-19 leading to less accurate weather forecasts

WASHINGTON--Weather forecasts have become less accurate during the COVID-19 pandemic due to the reduction in commercial flights, according to new research.

A new study in AGU's journal Geophysical Research Letters finds the world lost 50-75% of its aircraft weather observations between March and May of this year, when many flights were grounded due to the pandemic.

Aircraft typically inform weather forecasts by recording information about air temperature, relative humidity, air pressure and wind along their flight path. With significantly fewer planes in the sky this spring, forecasts of these meteorological conditions have become less accurate and the impact is more pronounced as forecasts extend further out in time, according to the study, which is part of an ongoing special collection of research in AGU journals related to the current pandemic.

Weather forecasts are an essential part of daily life, but inaccurate forecasts can also impact the economy, according to Ying Chen, a senior research associate at the Lancaster Environment Centre in Lancaster, United Kingdom and lead author of the new study. The accuracy of weather forecasts can impact agriculture as well as the energy sector and stability of the electrical grid. Wind turbines rely on accurate forecasts of windspeed and energy companies depend on temperature forecasts to predict what the energy load will be each day as people crank up their air conditioning.

"If this uncertainty goes over a threshold, it will introduce unstable voltage for the electrical grid," Chen said. "That could lead to a blackout, and I think this is the last thing we want to see in this pandemic."

The regions whose forecasts have been most affected by the loss of aircraft observations are those with normally heavy air traffic, like the United States, southeast China and Australia, as well as isolated regions like the Sahara Desert, Greenland and Antarctica. Western Europe is a notable exception: its weather forecasts have been relatively unaffected despite the number of aircraft over the region dropping by 80-90%.

This was surprising, Chen said. Chen suspects the region has been able to avoid inaccuracies because it has a densely-packed network of ground-based weather stations and balloon measurements to compensate for the lack of aircraft.

"It's a good lesson which tells us we should introduce more observation sites, especially in the regions with sparse data observations," Chen said. "This will help us to buffer the impacts of this kind of global emergency in the future."

Chen also found precipitation forecasts around the world have not been significantly affected, because rainfall forecasts have been able to rely on satellite observations. But March, April and May have been relatively dry this year in most of the world, so Chen cautions that precipitation forecasts could potentially suffer as the hurricane and monsoon seasons arrive.

Comparing forecasts

Forecast models are more accurate when a greater number of meteorological observations are taken into account, and the number of observations is greatly diminished when fewer planes are in the air, as was the case in March-May of this year. The Aircraft Meteorological Data Relay program comprises over 3,500 aircraft and 40 commercial airlines, which typically provide over 700,000 meteorological reports a day.

When Chen compared the accuracy of weather forecasts from March-May 2020 to the same periods in 2017, 2018 and 2019, he found the 2020 forecasts were less accurate for temperature, relative humidity, windspeed and air pressure. This is despite the fact that in February, before flights were significantly impacted, weather forecasts were more accurate than in previous years.

He found surface pressure and wind speed forecasts were unaffected in the short term (1-3 days) but were less accurate for the longer-term (4-8 days) forecasts included in the study. In February, before the number of flights dropped off, forecast accuracy in several regions that rely on aircraft observations had actually improved by up to 1.5 degrees Celsius (about 2.7 degrees Fahrenheit) over previous years. But in March-May 2020, when flights were reduced by 50-75% compared to February, that improvement in accuracy vanished.

Chen found western Europe was the only region with normally high flight traffic that did not suffer remarkably reduced accuracy in temperature forecasts. He attributed this to over 1,500 meteorological stations that form a dense data collection network in the area.

However, European weather was particularly unvarying over the March-May 2020 time period, making it easier to forecast with less data, according to Jim Haywood, a professor of atmospheric science at the University of Exeter, United Kingdom, who was not involved with the new study. Haywood suspects this played a role in the persisting accuracy of western European forecasts in addition to the network of ground observation points.

The longer forecasters lack aircraft data, the more weather forecasts will be impacted, according to the study. While precipitation forecasts have so far been unaffected, scientists' ability to catch early warning signs of extreme weather events this summer could suffer. In the long term, the study results suggest sources of weather data should be diversified, especially in observation-sparse areas and areas that rely heavily on commercial flights, according to Chen.

Credit: 
American Geophysical Union

New insight into the origin of water on the earth

video: Organic matter analog producing water and oil by heating. Round water droplets were remarkably formed at approximately 350 °C. (Hideyuki Nakano et al., Scientific Reports, May 8, 2020)

Image: 
Hideyuki Nakano et al., Scientific Reports, May 8, 2020

Scientists have found that interstellar organic matter could produce an abundant supply of water when heated, suggesting that organic matter could be the source of terrestrial water.

A number of mysteries remain about our planet, including the elusive origin of its water. Earlier studies suggested that terrestrial water had been delivered by icy comets or by meteorites containing hydrous silicates that came from outside the "snow line" -- the boundary beyond which ice can condense due to the low temperatures. More recent studies, however, have provided observations opposing the cometary-origin theory, while still failing to suggest plausible alternative sources of terrestrial water. "Until now, much less attention has been paid to organic matter, compared to ices and silicates, even though there is an abundance of it inside the snow line," says planetary scientist Akira Kouchi at Hokkaido University.

In the recent study published in Scientific Reports, a group of scientists led by Akira Kouchi demonstrates that heating interstellar organic matter to high temperatures could yield abundant water and oil. This suggests that water could be produced inside the snow line, without any contribution from comets or meteorites delivered from outside the snow line.

As a first step, the researchers made an analog of the organic matter in interstellar molecular clouds using chemical reagents. To make the analog, they referred to analytical data on interstellar organics made by irradiating UV light onto a mixture containing H2O, CO, and NH3, which mimicked the natural synthetic process. Then, they gradually heated the organic matter analog from 24 to 400 °C under pressurized conditions in a diamond anvil cell. The sample was uniform up to 100 °C, but separated into two phases at 200 °C. At approximately 350 °C, the formation of water droplets became evident, and the sizes of the droplets increased as the temperature rose. At 400 °C, in addition to water droplets, black oil was produced.

The group conducted similar experiments with larger amounts of organic matter, which also yielded water and oil. Their analysis of absorption spectra revealed that the main component of the aqueous product was pure water. Additionally, chemical analysis of produced oil showed similar characteristics to the typical crude oil found beneath the earth.

"Our results show that the interstellar organic matter inside the snow line is a potential source of water on the earth. Moreover, the abiotic oil formation we observed suggests more extensive sources of petroleum for the ancient Earth than previously thought," says Akira Kouchi. "Future analyses of organic matter in samples from the asteroid Ryugu, which the Japan's asteroid explorer Hayabusa2 will bring back later this year, should advance our understanding of the origin of terrestrial water."

Credit: 
Hokkaido University

Researchers realize nanoscale electrometry based on magnetic-field-resistant spin sensor

A team led by Prof. DU Jiangfeng, Prof. SHI Fazhan and Prof. WANG Ya from University of Science and Technology of China (USTC) of the Chinese Academy of Sciences (CAS) proposed a robust electrometric method utilizing continuous dynamic decoupling (CDD) technique, where the continuous driving fields provide a magnetic-field-resistant dressed frame. The study was published in Physical Review Letters on June 19.

Characterization of electrical properties and understanding of dynamics at the nanoscale have become significant in the development of modern electronic devices, such as semiconductor transistors and quantum chips, especially as feature sizes shrink to several nanometers.

The nitrogen-vacancy (NV) center in diamond, an atomic-scale spin sensor, has been shown to be an attractive electrometer. Benefiting from in situ compatibility with diamond-based semiconductor devices and the potential for electric-field imaging when combined with scanning technology, electrometry using the NV center would benefit various sensing and imaging applications. However, its natural susceptibility to magnetic fields hinders effective detection of the electric field.
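To see why magnetic fields are the dominant nuisance, it helps to write the NV ground-state spin Hamiltonian schematically. The conventions and coupling values below are taken from the standard NV literature, not from this paper, and the exact form of the transverse electric term varies between references:

```latex
\frac{H}{h} \approx D S_z^2 \;+\; \gamma_e\,\mathbf{B}\cdot\mathbf{S}
\;+\; d_{\parallel} E_z S_z^2
\;+\; d_{\perp}\!\left[E_x\left(S_y^2 - S_x^2\right) + E_y\left(S_x S_y + S_y S_x\right)\right]
```

with D ≈ 2.87 GHz, γ_e ≈ 2.8 MHz/G, d_∥ ≈ 0.35 Hz·cm/V and d_⊥ ≈ 17 Hz·cm/V. Because the magnetic coupling is many orders of magnitude stronger than the electric one, even tiny magnetic fluctuations can swamp the electric-field signal, which is why a driving scheme that dresses the spin against magnetic noise is attractive.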

The NV center is a defect in diamond consisting of a substitutional nitrogen atom and an adjacent vacancy. The outstanding achievements of the NV center stem from its remarkable properties, most notably convenient state polarization and readout with a 532-nm laser and a long coherence time owing to the spin purity of its environment.

In this study, the researchers used a Ramsey-like sequence in the dressed frame to measure the electric field. Also, they measured the dephasing of the near-surface NV centers (8 nm deep from the diamond surface) to evaluate the surface electric noise.

They demonstrated a robust method for nanoscale electrometry based on spin sensors in diamond. Compared with electrometry performed by applying a nonaxial magnetic field, their method has the same susceptibility to the electric field and is more robust against magnetic noise. Therefore, a higher electric-field sensitivity is achievable.

In addition, their electrometry is more applicable in the presence of strong magnetic-field inhomogeneity or fluctuations, which is favorable for practical applications using near-surface NV centers, for example the characterization of multiferroic materials.

They also used this method to study the noise environment of near-surface NV centers. By excluding the magnetic noise, they observed a quantitative relation between the dephasing rate of NV centers and the relative dielectric permittivity of the liquids covering the surface.

This study helps further understanding of the noise environment of near-surface NV centers, which is essential for a wide range of sensing applications and offers interesting avenues for nanoscale dielectric sensing.

Credit: 
University of Science and Technology of China

Honeybees reveal environmental pollution in their surroundings

image: Bees where the new tool was tested in Cordoba

Image: 
University of Cordoba

Honeybee colonies are bioindicators of environmental contamination in their area: the bees become coated in whatever is present in the environment, including pollutants, and they carry it all back to their hives.

Bees sample a significant range of spaces because they have a wide flight range, becoming covered with whatever is deposited in the air, water and soil, as well as on trees and flowers. In addition, when they reach the hive, they also transport the nectar they have collected, which passes to the other bees and spreads throughout the hive. However, using hives to assess the state of environmental contamination has involved capturing bees and extracting what they have ingested and transported on the surface of their bodies; sampling can also be done with larvae, pollen reserves and honey. All of this is tedious and, at times, aggressive toward the hive.

To continue obtaining the information about environmental pollution that bees can provide, without altering the normal functioning of hives, Professor José Manuel Flores, from the Department of Zoology at the University of Cordoba, collaborated on a European project at the University of Almeria to put APIStrip, a non-invasive tool for sampling contaminants in hives, into operation.

APIStrip (Adsorb Pesticide In-hive Strip) is based on a polystyrene strip coated with a concentrated solution of Tenax, a sorbent that collects the compounds bees carry into the hive; the pesticides and pollutants adsorbed on its surface are later extracted and analyzed. To date, up to 442 different pesticides can be detected with this method.

To validate this technology, the team performed field studies in Cordoba and in Denmark. Professor José Manuel Flores led several APIStrip trials in the bee colony at the Rabanales Campus, testing different quantities of the sorbent, different placements of the strip and different lengths of time the APIStrip was left in the hives. They determined that the ideal sampling method uses a 5x10 cm strip with 1 g of Tenax left in place for 14 days.

Two of the main toxic risks for bees come from treatments applied by beekeepers to control a parasitic mite and from the nearby use of plant protection products. In the Denmark sampling, up to 40 different pesticide residues were found.

With this methodology, bees become sample collectors of their surroundings and bioindicators of environmental contamination without any disruption to their normal routine, allowing us to understand the environmental conditions of their surroundings and to plan actions to improve environmental health.

Credit: 
University of Córdoba

Fast and flexible computation of optical diffraction

image: (a) Sketch of the optical system. (b) CGH displayed on the SLM for the generation of a 9×9 foci array. (c) The foci array on the focal plane of Lens 1 (P plane). (d) Phase distribution and (e) intensity distribution on the entrance pupil of the objective (E plane). (f) Simulated and (g) measured multi-foci array generated on the focal plane of the objective (F plane). (h) Enlarged intensity profile of a single focal spot in the array. The arrows indicate the polarization directions. (i) Longitudinal intensity profile and corresponding line plot of the foci array. (j) Simulated and (k) measured intensity distribution on the F plane when the CGH for the generation of the pattern "E" is encoded on the SLM. (l-m) Enlarged intensity profiles of the pattern corresponding to (j) and (k) with the same sampling points as in (i).

This research received funding from the National Natural Science Foundation of China, USTC Research Funds of the Double First-Class Initiative, Youth Innovation Promotion Association of the Chinese Academy of Sciences, and National Key R&D Program of China.

Image: 
by Yanlei Hu, Zhongyu Wang, Xuewen Wang, Shengyun Ji, Chenchu Zhang, Jiawen Li, Wulin Zhu, Dong Wu, Jiaru Chu

Diffraction is a classic optical phenomenon accounting for the propagation of light. The efficient calculation of diffraction is of significant value for the real-time prediction of light fields. The diffraction of electromagnetic (EM) waves can be categorized into scalar diffraction and vector diffraction, according to which approximation conditions are valid. Although mathematical expressions for both types of diffraction have long been established, fundamental breakthroughs have rarely been achieved in computation algorithms. The direct integration method and the fast Fourier transform (FFT) method have been developed, but they suffer from either low efficiency or poor flexibility. A versatile way to compute optical diffraction in an efficient and flexible fashion is therefore in high demand.

In a new paper published in Light: Science & Applications, a team of scientists led by Professors Jiawen Li and Dong Wu from the CAS Key Laboratory of Mechanical Behavior and Design of Materials, Key Laboratory of Precision Scientific Instrumentation of Anhui Higher Education Institutes, Department of Precision Machinery and Precision Instrumentation, University of Science and Technology of China, and co-workers has proposed an efficient full-path calculation method by exploiting the mathematical similarities between scalar and vector diffraction. Both scalar and vector diffraction are expressed using the highly flexible Bluestein method. The computation time is reduced to the sub-second level, five orders of magnitude faster than the direct integration approach and two orders of magnitude faster than the FFT method. Furthermore, the regions of interest (ROIs) and the sampling numbers can be chosen arbitrarily, endowing the proposed method with superior flexibility. Finally, full-path light tracing of a typical laser holographic system is presented with unprecedented computation speed, in good agreement with experimental results. The proposed method holds great promise for universal applications in optical microscopy, fabrication, and manipulation.
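To give a flavour of the underlying idea, the sketch below implements a one-dimensional Bluestein (chirp-z) "zoom" DFT in NumPy: it evaluates the spectrum over an arbitrary frequency band with an arbitrary number of samples at FFT-like cost, which is the spectral-zoom property exploited here for diffraction fields. It is a generic illustration, not the authors' scalar/vector diffraction code.

```python
import numpy as np

def zoom_dft(x, f1, f2, m, fs=1.0):
    """Bluestein (chirp-z) evaluation of the DTFT of x at m frequencies
    equally spaced over [f1, f2), at O((n + m) log(n + m)) cost.
    Unlike a plain FFT, the output band and sample count are chosen freely."""
    x = np.asarray(x, dtype=complex)
    n = x.size
    df = (f2 - f1) / m                                  # output frequency step
    k = np.arange(max(n, m))
    chirp = np.exp(-1j * np.pi * df / fs * k**2)        # W**(k^2/2), W = exp(-2j*pi*df/fs)
    # Pre-multiply the input by the start-frequency phase and by the chirp.
    xp = x * np.exp(-2j * np.pi * f1 / fs * np.arange(n)) * chirp[:n]
    # Convolve with the conjugate chirp via zero-padded FFTs (Bluestein's trick).
    nfft = int(2 ** np.ceil(np.log2(n + m - 1)))
    ic = np.zeros(nfft, dtype=complex)
    ic[:m] = np.conj(chirp[:m])
    ic[nfft - n + 1:] = np.conj(chirp[1:n][::-1])
    y = np.fft.ifft(np.fft.fft(xp, nfft) * np.fft.fft(ic))
    return y[:m] * chirp[:m]

# Sanity check: with f1 = 0, f2 = fs and m = n, the zoom DFT reproduces np.fft.fft.
x = np.random.default_rng(0).normal(size=256)
assert np.allclose(zoom_dft(x, 0.0, 1.0, 256), np.fft.fft(x))
```

In a diffraction setting, the same trick is applied along each transverse axis of the field so that the output window (the ROI) and its sampling grid no longer need to match the fixed grid that a plain FFT would impose.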

The Bluestein method is an elegant algorithm conceived by L. Bluestein and further generalized by L. Rabiner et al., and it is a promising tool in the engineer's arsenal in the field of digital signal processing. It can perform more general Fourier transforms at arbitrary frequencies and boost the resolution over the full spectrum, offering a spectral zoom operation with high resolution and arbitrary bandwidth. The scientists summarize their application of the Bluestein method to both scalar and vector diffraction computation:

"We revisited and deduced the integral formulas for scalar and vector diffraction in Fourier transform forms, and then utilize the Bluestein method to completely supplant the Fourier transform in a more flexible fashion. Based on this, optical diffraction is evaluated with designated ROIs and sampling numbers."

"A few representative examples are given for both scalar and vector diffraction to demonstrate the improvement in efficiency and flexibility. Moreover, full-path light tracing of an optical holographic system is presented with unprecedented computation speed. And the results are verified by the experimental measurements." they added.

"Some important adjustments are made to the conventional Bluestein method including the definition of complex starting point and additional phase shifting factor in order to cope with the realistic condition for optical calculations" the scientists emphasized. "The proposed fast and flexible method for retrieving the light field can find wide applications in the fields of optical microscopy, photolithography and optical manipulation" they forecast.

Credit: 
Light Publishing Center, Changchun Institute of Optics, Fine Mechanics And Physics, CAS

New technology speeds up organic data transfer

Researchers are pushing the boundaries of data speed with a brand new type of organic LED.

An international research team, involving Newcastle University experts, developed a visible light communication (VLC) setup capable of a data rate of 2.2 Mb/s by employing a new type of organic light-emitting diode (OLED).

To reach this speed, the scientists created new far-red/near-infrared, solution-processed OLEDs. By extending the spectral range to 700-1000 nm, they expanded the bandwidth and achieved the fastest-ever data speed for solution-based OLEDs.

Described in the journal Light: Science & Applications, the new OLEDs create opportunities for new internet-of-things (IoT) connectivity, as well as for wearable and implantable biosensor technology.

The project is a collaboration between Newcastle University, University College London, the London Centre for Nanotechnology, the Institute of Organic Chemistry - Polish Academy of Sciences (Warsaw, Poland) and the Institute for the Study of Nanostructured Materials - Research National Council (CNR-ISMN, Bologna, Italy).

Dr Paul Haigh, Lecturer in Communications at Newcastle University's Intelligent Sensing and Communications Group, was part of the research team. He led the development of real-time signal transmission at the highest possible speed, using information modulation formats developed in-house and achieving approximately 2.2 Mb/s.

Dr Haigh said: "Our team developed highly efficient long-wavelength (far-red/near-infrared) polymer LEDs for the first time, free of heavy metals, which has been a long-standing research challenge in the organic optoelectronics community. Achieving such high data rates opens up opportunities for the integration of portable, wearable or implantable organic biosensors into visible/nearly (in)visible light communication links."

The demand for faster data transmission speeds is driving the popularity of light-emitting devices in VLC systems. LEDs have multiple applications and are used in lighting systems, mobile phones and TV displays. While OLEDs don't offer the same speed as inorganic LEDs and laser diodes, they are cheaper to produce, recyclable and more sustainable.

The data rate the team achieved with the pioneering device is high enough to support an indoor point-to-point link, with a view to IoT applications.

The researchers highlight the possibility of achieving such data rates without computationally complex and power-demanding equalisers. Together with the absence of toxic heavy metals in the active layer of the OLEDs, the new VLC setup is promising for the integration of portable, wearable or implantable organic biosensors.

Credit: 
Newcastle University

Psychology: The most personal device

Everyone who uses a smartphone unavoidably generates masses of digital data that are accessible to others, and these data provide clues to the user's personality. Psychologists at Ludwig-Maximilians-Universitaet in Munich (LMU) are studying how revealing these clues are.

For most people around the world, smartphones have become an integral and indispensable component of their daily lives. The digital data that these devices incessantly collect are a veritable goldmine - not only for the five largest American IT companies, who make use of them for advertising purposes. They are also of considerable interest in other contexts. For instance, computational social scientists utilize smartphone data in order to learn more about personality traits and social behavior. In a study that appears in the journal PNAS, a team of researchers led by LMU psychologist Markus Bühner set out to determine whether conventional data passively collected by smartphones (such as times or frequencies of use) provide insights into users' personalities. The answer was clear cut. "Yes, automated analysis of these data does allow us to draw conclusions about the personalities of users, at least for most of the major dimensions of personality," says Clemens Stachl, who used to work with Markus Bühner (Chair of Psychological Methodologies and Diagnostics at LMU) and is now a researcher at Stanford University in California.

The LMU team recruited 624 volunteers for their PhoneStudy project. The participants agreed to fill out an extensive questionnaire describing their personality traits, and to install an app that had been developed specially for the study on their phones for 30 days. The app was designed to collect coded information relating to the behavior of the user. The researchers were primarily interested in data pertaining to communication patterns, social behavior and mobility, together with users' choice and consumption of music, the selection of apps used, and the temporal distribution of their phone usage over the course of the day. All the data on personality and smartphone use were then analyzed with the aid of machine-learning algorithms, which were trained to recognize and extract patterns from the behavioral data, and relate these patterns to the information obtained from the personality surveys. The ability of the algorithms to predict the personality traits of the users was then cross-validated on the basis of a new dataset. "By far the most difficult part of the project was the pre-processing of the huge amount of data collected and the training of the predictive algorithms," says Stachl. "In fact, in order to perform the necessary calculations, we had to resort to the cluster of high-performance computers at the Leibniz Supercomputing Centre in Garching (LRZ)."
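As a rough illustration of the kind of pipeline described above (train a model on behavioural features, then judge it on held-out data), the sketch below predicts a single self-reported personality score from passively logged smartphone features using cross-validation. The features and data are synthetic and purely hypothetical; this is not the PhoneStudy code.

```python
# Illustrative sketch only: synthetic data standing in for per-user smartphone
# features (e.g. calls per day, share of night-time usage, number of distinct
# apps used, music-genre diversity) and one self-reported personality score.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_users, n_features = 624, 6            # 624 matches the study's sample size
X = rng.normal(size=(n_users, n_features))
# Synthetic "extraversion"-like score: partly driven by two features, plus noise.
y = 0.6 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.8, size=n_users)

model = RandomForestRegressor(n_estimators=300, random_state=0)
r2_per_fold = cross_val_score(model, X, y, cv=5, scoring="r2")  # out-of-sample fit
print(f"mean cross-validated R^2: {r2_per_fold.mean():.2f}")
```

The cross-validation step is the crucial part: the model is always evaluated on users it has not seen during training, which is what allows a claim that smartphone behaviour predicts, rather than merely correlates with, the questionnaire scores.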

The researchers focused on the five most significant personality dimensions (the Big Five) identified by psychologists, which enable them to characterize personality differences between individuals in a comprehensive way. These dimensions relate to the self-assessed contribution of each of the following traits to a given individual's personality: (1) openness (willingness to adopt new ideas, experiences and values), (2) conscientiousness (dependability, punctuality, ambitiousness and discipline), (3) extraversion (sociability, assertiveness, adventurousness, dynamism and friendliness), (4) agreeableness (willingness to trust others, good natured, outgoing, obliging, helpful) and (5) emotional stability (self-confidence, equanimity, positivity, self-control). The automated analysis revealed that the algorithm was indeed able to successfully derive most of these personality traits from combinations of the multifarious elements of their smartphone usage. Moreover, the results provide hints as to which types of digital behavior are most informative for specific self-assessments of personality. For example, data pertaining to communication patterns and social behavior (as reflected by smartphone use) correlated strongly with levels of self-reported extraversion, while information relating to patterns of day and night-time activity was significantly predictive of self-reported degrees of conscientiousness. Notably, links with the category 'openness' only became apparent when highly disparate types of data (e.g., app usage) were combined.

The results of the study are of great value to researchers, as studies have so far been almost exclusively based on self-assessments. The conventional method has proven to be sufficiently reliable in predicting levels of professional success, for instance. "Nevertheless, we still know very little about how people actually behave in their everyday lives - apart from what they choose to tell us on our questionnaires," says Markus Bühner. "Thanks to their broad distribution, their intensive use and their very high level of performance, smartphones are an ideal tool with which to probe the relationships between self-reported and real patterns of behavior."

Clemens Stachl is aware that his research might further stimulate the appetites of the dominant IT firms for data. In addition to regulating the use of passively collected data and strengthening rights to privacy, we also need to take a comprehensive look at the field of artificial intelligence, he says. "The user, not the machine, must be the primary focus of research in this area. It would be a serious mistake to adopt machine-based methods of learning without serious consideration of their wider implications." The potential of these applications - in both research and business - is tremendous. "The opportunities opened up by today's data-driven society will undoubtedly improve the lives of large numbers of people," Stachl says. "But we must ensure that all sections of the population share the benefits offered by digital technologies."

Credit: 
Ludwig-Maximilians-Universität München

Analysis finds multiple social disadvantages magnify stroke risk

DALLAS, July 16, 2020 -- Having more social disadvantages can nearly triple your risk of stroke, particularly if you are under the age of 75, according to new research published today in Stroke, a journal of the American Stroke Association, a division of the American Heart Association.

Researchers already know that some social disadvantages, such as living in an impoverished or rural area, having a low education or income level, lacking health insurance or being Black, may contribute to increased stroke risk. In this study, researchers investigated if there's a cumulative effect from having multiple social disadvantages - known as social determinants of health (SDOH).

"We were focused on understanding how having multiple social determinants of health affect stroke risk, and we found significant health disparities that have a profound impact on people's lives, especially in vulnerable populations," said co-first study author Evgeniya Reshetnyak, Ph.D., a senior research data analyst at Weill Cornell Medicine in New York City.

"Our study shows that the risk of stroke is amplified among individuals with multiple social determinants of health factors, especially for those who are younger than 75 years old. There is a cumulative effect of multiple social determinants of health. In fact, every additional disadvantage further increases stroke risk."

Using data from participants in the REasons for Geographic And Racial Differences in Stroke (REGARDS) study, researchers examined information for 27,813 Black and white adults (average age 64.7) who lived in the contiguous U.S. (48 states) and the District of Columbia and were followed for 10 years. Data from America's Health Ranking, which ranks public health infrastructure by state, were used to define states with poor public health infrastructure. During the REGARDS study, 1,470 incidents of stroke were reported among the participants.

Among participants younger than 75, compared to people with no social determinants of health factors, the researchers found:

there is a cumulative effect of multiple social determinants of health - the risk of stroke is increased among those individuals with multiple social determinants of health;
people with three or more social determinants of health were nearly two and a half times more likely to have incident stroke; and
after adjusting for other risk factors, stroke risk remained 50% higher among those with three or more social determinants of health.
In addition, researchers noted:

Black women specifically were more likely to have a greater number of social disadvantages;
people with more social determinants of health were also more likely to have more traditional risk factors such as hypertension or Type 2 diabetes; and
residents in the Southeastern part of the U.S. (North and South Carolina, Georgia, Tennessee, Mississippi, Alabama, Louisiana and Arkansas) were at higher risk for stroke due to poor dietary habits and less investment in social safety nets.
"There is a need for policies and interventions that specifically target younger vulnerable populations," Reshetnyak said. "Early interventions are crucial for reducing stroke disparities. Although social determinants of health are difficult to change, their effect can be mitigated with timely interventions. However, programs may not be as effective at later ages when physiological factors may begin to dominate over social factors. "

"Health care professionals should pay special attention to those patients with multiple social determinants of health. Physicians should emphasize the importance of lifestyle changes, more aggressively control risk factors, and recommend available outreach and educational programs that could help reduce stroke risk."

Further research is needed to identify which social determinants of health contribute the most to stroke risk so future policy decisions could be prioritized.

Limitations of the study include that some data was self-reported, and tobacco use and exposure was limited to traditional cigarettes. Other social determinants of health, including perceived discrimination, police brutality, racial discrimination in the penal system and environmental factors, were not included in the study. Additionally, Latinos and Asians and other vulnerable populations were not included in this study.

Credit: 
American Heart Association

Rare mutation of TP53 gene leaves people at higher risk for multiple cancers

image: A modeling of the TP53 mutation detailed in this study

Image: 
Penn Medicine

PHILADELPHIA - Rare inherited mutations in the body's master regulator of the DNA repair system - the TP53 gene - can leave people at a higher risk of developing multiple types of cancer over the course of their lives. Now, for the first time, a team led by researchers in the Basser Center for BRCA at the Abramson Cancer Center of the University of Pennsylvania details the potential implications of a lower risk TP53 mutation, including an association with a specific type of Li-Fraumeni syndrome (LFS), an inherited predisposition to a wide range of cancers. The researchers published their findings today in Cancer Research, a journal of the American Association for Cancer Research.

Mutations in the TP53 gene are the most commonly acquired mutations in cancer. The p53 protein, made by the TP53 gene, normally acts as the supervisor in the cell as the body tries to repair damaged DNA. Different mutations can determine how well or how poorly that supervisor is able to direct the response. The more defective the mutation, the greater the risk. When TP53 mutations are inherited, they cause LFS, a disease that leaves people with a 90 percent chance of developing cancer in their lifetime. These commonly include soft tissue and bone sarcomas, breast and brain cancer, adrenocortical tumors, and leukemia, and patients undergo frequent screening starting as infants to look for signs of disease, given the high risk of childhood cancers that continues throughout their lives. There are currently no therapies that target the p53 pathway.

"Due to the wide variety of disease types associated with inherited TP53 mutations and the early age of cancer diagnoses, cancer screening is exceptionally aggressive. However we do not yet know if all mutations require the same high level of screening," said the study's senior author Kara N. Maxwell, MD, PhD, an assistant professor of Hematology-Oncology and Genetics in the Perelman School of Medicine at the University of Pennsylvania and a member of the Abramson Cancer Center and the Basser Center for BRCA. "It is therefore critical to study the specifics of individual TP53 mutations so we can understand how best to screen people who carry lower risk mutations."

The study's co-first authors are Jacquelyn Powers, MS, LCGC, a genetic counselor in the Basser Center for BRCA, and Emilia Modolo Pinto, PhD, an associate scientist at St. Jude Children's Research Hospital.

For this study, researchers genetically sequenced multiple members of eight different families, then combined those data into a study cohort, along with data of carriers from two genetic testing cohorts. They found this newly identified mutation confers a risk of childhood cancer in some families, but only later onset cancers in others, and likely at a lower than 90 percent lifetime risk.

The St. Jude team, as well as researchers from The Wistar Institute, were critical to the next stage of the research.

The St. Jude team modeled the TP53 mutation to try to figure out what effect it had on the protein's structure. In addition, working closely with the Penn team, the St. Jude team also determined that there is an inherited set of genetic material shared among people who have this mutation, suggesting it's what's called a founder mutation - a mutation that tracks within one ethnicity. In this case, that ethnicity is the Ashkenazi Jewish population.

"By using the same model that led us to determine a founder mutation widespread in Brazilian individuals we have determined this new one in the Ashkenazi Jewish population," Pinto said.

A team from The Wistar Institute studied the consequences of the new mutation on the function of the p53 protein in cells and showed it affects the expression levels of multiple p53 target genes, suggesting this might play a role in transformation and cancer formation.

"By identifying and understanding this Ashkenazi variant of p53, our goal is to help people who have genetic variants of this critical gene to better understand their cancer risk, and eventually to assist the development of new specific treatments that will reduce the burden of cancer on this population," said Maureen E. Murphy, PhD, Ira Brind Professor and program leader of the Molecular & Cellular Oncogenesis Program of the Wistar Cancer Center.

The findings raise questions about how to appropriately screen patients for this mutation and whether the standard process of full-body scans for LFS patients should be modified for this group, since their risk profile is different than those with classic LFS.

Maxwell and her team are part of a group working on developing liquid biopsy techniques - a blood test that would be able to improve detection. They say they're hopeful this study will help inform future liquid biopsy work.

Credit: 
University of Pennsylvania School of Medicine

New map for radioactive soil contamination in Western Europe

An international consortium of scientists has refined the map of caesium and plutonium radionuclide concentrations in soils in Switzerland and several neighbouring countries. Using an archive of European soil samples, the team led by Katrin Meusburger from the University of Basel, now at the WSL research institute, was able to trace the sources of radioactive fallout between 1960 and 2009. This study was published in Scientific Reports.

The new map covers Switzerland as well as several neighbouring countries (France, Italy, Germany, Belgium). It is based on a new calculation method, namely the use of the caesium/plutonium ratio. These two radionuclides were released during military nuclear tests, particularly in the 1960s; caesium was also released during the Chernobyl accident in 1986.

"We have created a new map to provide a basis for estimating the soil loss since the anthropogenic release of radionuclides," says first author Katrin Meusburger from the group Environmental Geocience at the University of Basel, who is now working at the Swiss Federal Institute for Forest, Snow and Landscape Research WSL. "To do this, it is important to know the proportion of radioactive fallout from Chernobyl."

The data collected, made available to the scientific community and the public, are useful for establishing a reference base in the event of possible future fallout of radionuclides, but also for use in new studies, particularly in geomorphology. They will, for example, allow the reconstruction of soil erosion rates since the 1960s in areas of Europe where there have been major landscape changes.

Higher accuracy and resolution

The consortium researchers used 160 samples from a European soil sample bank from 2009. These samples were taken from soils under grassland, which have remained stable since the 1960s (absence of erosion and accumulation) and are representative of the variability of rainfall conditions observed in the countries covered by the study.

The radionuclides found in these samples, caesium and plutonium (137Cs, 239Pu, 240Pu), left a specific footprint in European soils. In the countries covered by the study, the plutonium came exclusively from the nuclear tests, whereas the caesium is the result of both the nuclear tests, particularly in the 1960s, and the 1986 Chernobyl accident. The ratio between caesium and plutonium therefore differs depending on whether the fallout comes from the nuclear tests or from the Chernobyl accident, and it is this ratio that enabled the researchers to trace the origin of the artificial radionuclides deposited on European soils. "Unlike with the previous map, we can now distinguish between the sources of nuclear fallout," says Meusburger.
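Schematically, the partitioning works as follows. The reference ratio R_fallout below stands for a decay-corrected literature value characteristic of global weapons-test fallout; this is an illustration of the logic, not the study's exact calibration:

```latex
{}^{137}\mathrm{Cs}_{\text{tests}} \;=\; R_{\text{fallout}} \times \left({}^{239+240}\mathrm{Pu}\right)_{\text{measured}},
\qquad
{}^{137}\mathrm{Cs}_{\text{Chernobyl}} \;=\; {}^{137}\mathrm{Cs}_{\text{total}} \;-\; {}^{137}\mathrm{Cs}_{\text{tests}}
```

Because the plutonium inventory fixes the weapons-test share of the caesium, whatever caesium remains in a sample can be attributed to Chernobyl.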

The study concludes that the caesium resulting from the nuclear tests - carried out in the stratosphere, i.e. at high altitude - circulated in the atmosphere before being brought down to the ground by rain in a fairly homogeneous manner, with slightly higher amounts in the rainiest regions, such as the Massif Central, the Ardennes and Brittany. On the other hand, the caesium released during the Chernobyl accident did not reach such altitudes; it remained at the tropospheric level. The scattered rains of late April and early May 1986 quickly brought it back to the ground in the areas over which the plume from Ukraine had travelled. The spatial distribution of this radioactive fallout is thus much more heterogeneous, with locally higher concentrations in Alsace, Franche-Comté and the foothills of the Alps, northern Italy and southern Germany.

Credit: 
University of Basel

Lesion of doom -- how a parasitic bacterium induces blood vessel formation to cause lesions

image: Aorta tissues that were not exposed to BafA (left) did not sprout new vessels, while BafA-exposed aorta tissues (right) did.

Image: 
Kentaro Tsukamoto

Bacteria of the genus Bartonella are parasites that can be transmitted to humans via insect bites and animal scratches, resulting in an infection known as "bartonellosis." Cat-scratch disease and trench fever are forms of bartonellosis caused by different Bartonella species infecting humans. Bartonella bacteria can cause lesions to pop up in the skin and internal organs. To provide themselves with a safe habitat, the bacteria increase the number of "vascular endothelial" cells (the cells that line the interior of blood vessels), in which they hide from the host immune system, and stimulate the creation of new blood vessels through a process called "angiogenesis."

Previous studies on Bartonella henselae (B. henselae for short), the bacterium responsible for cat-scratch disease, have shown that it can directly "inject" proteins that inhibit programmed cell death (apoptosis) into the endothelial cells. However, B. henselae can also promote angiogenesis without directly contacting endothelial cells, which implies that the bacterium can secrete a bioactive substance that takes on the duty of kick-starting angiogenesis.

In a new study published in Nature Communications, a team of scientists led by Senior Assistant Professor Kentaro Tsukamoto and Professor Yohei Doi of Fujita Health University, Japan, has identified this bioactive substance as a protein, which they named Bartonella angiogenic factor A, or "BafA" for short. This is the very first report of a vascular endothelial growth factor (VEGF)-like protein produced by bacteria.

The scientists started their project by introducing B. henselae into human endothelial cells in petri dishes, and observed that the bacteria caused the endothelial cells to multiply. To identify the genes that give B. henselae this ability, the researchers began inducing random mutations in the DNA of the bacteria and seeing whether the mutated bacteria could still make the endothelial cells multiply. Through these experiments, the scientists determined that B. henselae can stimulate angiogenesis in human endothelial cells only if it possesses a functional copy of the gene that "codes for," or guides the synthesis of, the BafA protein. They also observed that exposing human endothelial cells to the isolated BafA protein caused the cells to multiply.

Then, to confirm that BafA stimulates angiogenesis, the scientists extracted samples of a major blood vessel called the aorta from mice and placed the samples in gels that did or did not contain BafA. As can be seen in the image below, the aorta samples that were not exposed to BafA did not sprout new blood vessels, but the aorta samples that were exposed to BafA grew vessels that extended into the gel. The scientists also found that surgically placing a BafA-containing gel plug into living mice led to blood vessels growing from the surrounding tissue into the gel.

Further experiments with human endothelial cells in petri dishes indicated that BafA activated cell surface receptors that recognize VEGF. By binding to these receptors, BafA triggered the activation of a process inside the cells, involving proteins called mitogen-activated protein kinase (MAPK) and extracellular signal-regulated kinases (ERKs). The MAPK/ERK pathway plays an important role in the multiplication of endothelial cells and angiogenesis. "In the last set of experiments, we performed similar studies in a related bacterium called Bartonella quintana, the bacterium that causes trench fever, and we found that it produces its own version of BafA that also causes human endothelial cells to multiply," explains Dr Tsukamoto.

These findings provide valuable insights into the mechanisms by which infectious bacteria can produce lesions in their hosts. "We believe that BafA proteins can be leveraged as tools for studying angiogenesis, and we also consider potential medical benefits," reports Prof Doi. "Most importantly," he elaborates, "BafA is a potential target for the development of diagnostic and therapeutic strategies for bartonellosis."

The scientists also speculate that BafA proteins could be used in regenerative medicine, which is a highly specialized branch of medicine that deals with replacing or regenerating lost or damaged parts of the body. Further research is needed to confirm the scientists' findings, but needless to say, BafA proteins will certainly be of immense interest to the scientific community.

Credit: 
Fujita Health University

Healthy offspring from testicular tissue transplantation in mice: Retinoic acid key

image: Repetitive RA treatment improves the spermatogenesis of testis pieces transplanted into the testis interstitium of germ cell-depleted mice.

Image: 
Copyright © 2020, Springer Nature, licensed under cc by 4.0

One in fifty births in Japan is said to result from in vitro fertilization, and many couples remain unable to conceive for reasons unknown. Infertility is also an undesired side-effect of lifesaving cancer treatment in childhood.

Akihiro Tsuchimoto, Masaaki Tone and Seiji Takashima of Shinshu University transplanted testis tissue to evaluate its effectiveness as a treatment for male infertility. When only sperm stem cells are transplanted, the germ cells of the recipient (the individual being treated) need to be depleted to ensure engraftment and growth of the transplanted stem cells. However, contrary to their expectation, depletion of recipient germ cells actually worsened sperm production in the transplanted tissues. They set out to find out why.

The reason for the poorer outcome in testis tissue transplantation was the depletion of germ cells, which contribute to retinoic acid (RA) synthesis in the testes. Building on this finding, the team succeeded in restoring donor spermatogenesis by administering RA to the recipients, and produced healthy offspring via intracytoplasmic sperm injection, a standard method in human assisted reproductive technology.

Although RA can be teratogenic when administered to pregnant mothers, no adverse effects were observed with this treatment, suggesting that testis tissue transplantation combined with on-demand RA administration is an effective method for preserving and recovering male fertility. The researchers are continuing their work on preserving and restoring male fertility. Testicular tissue transplantation should be relatively straightforward to put into practice, so the researchers are hopeful, although several parameters still need to be investigated before the approach can be used in reproductive technology.

For more information, please read "Germ cell depletion in recipient testis has adverse effects on spermatogenesis in orthotopically transplanted testis pieces via retinoic acid insufficiency."

Credit: 
Shinshu University