Tech

Statewide prevalence of gun ownership tied to police use of lethal force

Police use of lethal force in the United States has triggered public scrutiny of violent interactions between police and citizens. Past research has focused on whether race and levels of violence contribute to this phenomenon. A new study expands on prior research by examining the impact of the availability of firearms. It finds a pronounced positive relationship between statewide prevalence of gun ownership by citizens and police use of lethal force.

The study, by a researcher at Carnegie Mellon University, appears in The Annals of the American Academy of Political and Social Science.

"One consequence of higher rates of firearm prevalence in a state may lead to a greater frequency of police encountering individuals who are armed or suspected to be armed, which in turn results in a greater frequency of police using lethal force," explains Daniel S. Nagin, professor of public policy and statistics at Carnegie Mellon University's Heinz College, the author of the study.

Nagin based his work on data collected in a Washington Post inventory of police use of lethal force nationwide from 2015 to 2018. As a first step in his analysis, Nagin confirmed a positive correlation across all 50 states between the prevalence of firearms and the percentage of lethal encounters in which the victim possessed a firearm.
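As a rough illustration of what this first analytic step looks like in practice, the minimal sketch below computes a Pearson correlation across states; the figures are invented placeholders, not the study's data.

```python
# Minimal illustration of a state-level correlation check.
# The numbers below are invented placeholders, not the study's data.
from scipy.stats import pearsonr

# Hypothetical values for a few states: household firearm prevalence (%)
# and the share of fatal police encounters in which the victim had a firearm (%).
firearm_prevalence = [15.0, 22.5, 31.0, 40.2, 48.7, 57.9]
armed_victim_share = [39.0, 43.5, 51.0, 56.8, 62.4, 69.1]

r, p = pearsonr(firearm_prevalence, armed_victim_share)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```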

Nagin's analysis identified a pronounced, highly significant association between the statewide rate at which police use lethal force and the statewide prevalence of firearms. The association could be interpreted as reflecting a causal effect of the availability of firearms on police use of lethal force, Nagin suggests.

Because the data capture only violent encounters with police that ended in death, not all violent encounters, accounting for the proximity of trauma services is important. The analysis found that access to trauma centers - measured by the percentage of a state's population living within one hour of a Level I trauma center (one that can provide total care for every aspect of injury) or a Level II trauma center (one that has 24-hour immediate coverage by general surgeons and specialists) - was associated with lower rates of death among individuals who have violent encounters with police. The finding underscores the importance of rapid access to emergency medical care.

"One of the policy implications of the study's findings is that we should reduce the availability of firearms to active offenders and individuals at high risk of offending," Nagin suggests. "Among the policies intended to do this are universal background checks and training police to use de-escalation skills."

Credit: 
Carnegie Mellon University

Producing single photons from a stream of single electrons

Researchers at the University of Cambridge have developed a novel technique for generating single photons, by moving single electrons in a specially designed light-emitting diode (LED). This technique, reported in the journal Nature Communications, could help the development of the emerging fields of quantum communication and quantum computation.

A single photon, the elementary particle of light, can carry a quantum bit of information over hundreds of kilometres. Therefore, a source that can generate single photons is an important building block in many quantum technologies. Up to now, single-photon sources have been made in research labs from self-assembled quantum dots in semiconductors, or structural defects in diamonds. The formation of these dots and defects is a random process, so it is hard to predict the location and the photon energy (or wavelength) of these single-photon sources. This randomness may pose a challenge in integrating a source into a large quantum network.

In this article, the researchers show that they can generate a single photon in a different, controlled, way, without the need for a quantum dot or a defect, by moving only one electron at a time to recombine with a 'hole' (a missing electron in a filled 'band' of electrons).

'Imagine trying to send a digital message by firing a stream of blue or red balls over a wall in the following way. A conveyor belt with ball-sized indentations drags a series of white balls up a slope and drops the balls off a cliff at the end. Each ball picks up speed as it falls, and is then sprayed blue or red (depending on the message) as it bounces off to the side and over the wall', explains Dr Tzu-Kan Hsiao, who did the experiment during his PhD at Cambridge.

'The indentations in the conveyor belt can only carry one ball each.

'Only one ball gets sprayed at a time, and there's no chance some of the balls are intercepted by an eavesdropper without the person on the receiving end noticing a missing ball, whereas if sometimes two or more balls come at a time, the eavesdropper can catch odd balls and the receiver is none the wiser. In that way, some of the message may be unintentionally disclosed.'

'In the experiment, we made a device near the surface of Gallium Arsenide (GaAs) by using only industry-compatible fabrication processes. This device consists of a region of electrons close to a region of holes, and a narrow channel in between', says Professor Christopher Ford, team leader of the research.

'In order to transport only one electron at a time, we launch a sound wave along the surface. In GaAs such a "surface acoustic wave" also creates an accompanying electrical potential wave, in which each potential minimum carries just one electron. The potential wave, like a conveyor belt, brings individual electrons to the region of holes one after another. A series of single photons is generated when each electron quickly recombines with a hole before the next electron arrives.'

Each single photon could be given one of two polarisations to carry a message such that an eavesdropper cannot intercept the message without being detected.

Beyond being a novel single-photon source, this technique may, more importantly, make it possible to convert the state of an electron spin into the polarisation state of a photon. By bridging semiconductor-based quantum computers with single photons acting as 'flying' qubits, the ambitious goal of building large-scale distributed quantum-computing networks may be achieved.

Credit: 
University of Cambridge

New guidelines for hepatic failure in the intensive care unit

February 14, 2020 - For critical care specialists, hepatic failure poses complex challenges unlike those of other critical illnesses. A new set of evidence-based recommendations for management of liver failure is presented in the March issue of Critical Care Medicine, the official journal of the Society of Critical Care Medicine (SCCM). The journal is published in the Lippincott portfolio by Wolters Kluwer. The guidelines are being presented at the SCCM 49th Critical Care Congress.

The new guidelines assemble recommendations for critical care specialists managing the wide range of conditions and complications posed by liver failure - a serious organ derangement that carries a high risk of death, and for which liver transplantation may be the only definitive treatment. Rahul Nanchal, MD, MS, FCCM, of Medical College of Wisconsin, Milwaukee, and Ram Subramanian, MD, FCCM, of Emory University Hospital were Co-Chairs of the Guidelines Committee.

Recommendations on Liver Failure in Five Key Areas

As for all SCCM guidelines, the multidisciplinary, international expert Committee followed a rigorous approach to reviewing the best available evidence and developing consensus guidelines to answer a defined set of clinical questions. Two forms of hepatic failure are addressed. Acute liver failure (ALF) is a life-threatening condition associated with rapid loss of liver function - over a period of days or weeks - in a previously healthy person. Acute-on-chronic liver failure (ACLF) develops in a patient with pre-existing chronic liver disease.

Critically ill patients with liver disease are at risk of unique manifestations affecting various organ systems. "Strategies used to manage organ complications in general critical illness are not always applicable to the care of the patient with liver failure," according to the guideline statement. Through a formal review and guideline development process, the Committee approved 29 evidence-based recommendations in five areas:

Cardiovascular. Patients with ALF or ACLF are at risk of circulatory abnormalities leading to inadequate blood flow (shock). Recommendations address the choice of resuscitation fluid and use of vasopressor drugs in patients with ALF/ACLF in shock. The guidelines also include recommendations for blood pressure monitoring, including the use of invasive hemodynamic monitoring.

Hematology. Recommendations address the risks of bleeding and venous thromboembolism (VTE - blood clot-related complications) associated with ALF and ACLF. The guidelines address prevention and treatment of VTE and assessment of bleeding risk before invasive procedures.

Pulmonary. Several recommendations address preferred strategies for mechanical ventilation in patients with ALF/ACLF. The guidelines also discuss the management of some unique complications of chronic liver failure, including hepatopulmonary syndrome (low oxygen levels due to abnormal blood flow to the lungs) and hepatic hydrothorax (excess fluid around the lungs).

Renal. Issues related to decreased kidney function in patients with ALF/ACLF are analyzed. Due to lack of evidence, no recommendation is made for renal replacement therapy (RRT, or dialysis) during liver transplant surgery. Early RRT is recommended for patients with acute kidney injury, a common complication of ALF. Other recommendations address hepatorenal syndrome, a distinct type of kidney injury occurring in patients with cirrhosis.

Endocrine and Nutrition. The guidelines include recommendations for monitoring and control of blood glucose levels in patients with ALF/ACLF; the role of stress-dose steroids in patients with septic shock; and the use of enteral feeding. Recommendations call for drug screening to identify the wide range of potential causes of ALF/ACLF and medication dose adjustment for patients with decreased liver function.

Although the guidelines reflect the latest research on each topic, most of the recommendations are based on "low-quality indirect evidence" - for a few clinical questions, no evidence-based recommendation could be made. The Committee highlights areas in need of further research to better inform clinical practice. While acknowledging the limitations of the recommendations regarding the complex challenges in critically ill patients with ALF/ACLF, the Guideline authors conclude: "Our approach led to the generation of a contemporary document that can be used as a reference for clinicians."

Credit: 
Wolters Kluwer Health

Mayo Clinic study looks at changes in outcomes for coronary revascularization

ROCHESTER, Minn. -- The most common type of heart disease -- coronary artery disease -- affects 6.7% of adults and accounts for 20% of deaths among adults under age 65. The condition builds over time as inflammation and cholesterol-containing plaques settle in the heart's arteries, where they can eventually cause narrowing and blockages that lead to a heart attack.

In addition to lifestyle changes and medication, some people undergo revascularization, which is a medical procedure to open blocked coronary arteries and restore healthy blood flow. These procedures and the demographics of patients receiving revascularizations have changed over time.

Researchers evaluated nationwide data from 2003 to 2016 to assess current trends in the risk profiles and number of patients who had revascularization to treat coronary artery disease. They also assessed the in-hospital outcomes of more than 12 million revascularizations, 72% performed through percutaneous coronary intervention (PCI) and 28% by coronary artery bypass grafting (CABG). PCI, also known as angioplasty, uses a tiny balloon catheter that is inserted in a blocked blood vessel to widen it and may involve placing a mesh stent to hold the vessel open. CABG involves taking a healthy blood vessel from elsewhere in the body and connecting it beyond the blocked artery in the heart to redirect blood flow.

The resulting study, "Trends in Characteristics and Outcomes of Patients Undergoing Coronary Revascularization in the United States, 2003-2016," was published in JAMA Network Open. Mohamad Alkhouli, M.D., is first and corresponding author on the study and Amir Lerman, M.D., is senior author on the study. Drs. Alkhouli and Lerman are Mayo Clinic cardiologists.

"There was an increase in the proportion of elderly patients (more than 85 years old), patients from racial minorities and those with lower income over the time period that we studied. Interestingly, men constituted two-thirds of the overall revascularization population, and this did not change much over time," says Dr. Alkhouli.

"We noted a substantial decrease in the number of both PCI and CABG over time, likely because patients with stable coronary artery disease are increasingly being managed medically in light of the results of clinical trials demonstrating the effectiveness of that approach," explains Dr. Alkhouli.

The percentage of people who had revascularization following a heart attack rose significantly, from 22.8% in 2003-2007 to 53.1% in 2013-2016. Patients with severe heart failure tended to have a higher number of other negative health issues.

While the study was not designed to compare the two methods of revascularization, findings over time showed decreased in-hospital mortality after coronary artery bypass grafting but not after percutaneous coronary intervention. Dr. Alkhouli says this is somewhat surprising, but it may be at least partially explained by the shift in the clinical risk profile of patients undergoing percutaneous coronary intervention. He noted that although rigorous risk adjustment methodology was applied in the research, it is difficult to completely adjust for differences in risk factors in cohort studies.

"Several important questions remain open," says Dr. Alkhouli. "What can we do to further improve the outcomes of PCI, especially in the acute setting? How will the volumes and outcomes of PCI and CABG look in the next 10 years in light of the findings of the Ischemia trial? I suspect more research will emerge on these topics in the next few years."

Credit: 
Mayo Clinic

AI helps predict heart attacks and stroke

Artificial intelligence has been used for the first time to instantly and accurately measure blood flow, in a study led by UCL and Barts Health NHS Trust.

The AI-derived blood flow measurements were found to predict the chances of death, heart attack and stroke, and can be used by doctors to help recommend treatments that could improve a patient's blood flow.

Heart disease is the leading global cause of death and illness. Reduced blood flow, which is often treatable, is a common symptom of many heart conditions. International guidelines therefore recommend a number of assessments to measure a patient's blood flow, but many are invasive and carry a risk.

Non-invasive blood flow assessments are available, including Cardiovascular Magnetic Resonance (CMR) imaging, but up until now, the scan images have been incredibly difficult to analyse in a manner precise enough to deliver a prognosis or recommend treatment.

In the largest study of its kind, funded by the British Heart Foundation and published in the journal Circulation, researchers took routine CMR scans from more than 1,000 patients attending St Bartholomew's Hospital and the Royal Free Hospital and used a new automated artificial intelligence technique to analyse the images. By doing this, the teams were able to precisely and instantaneously quantify the blood flow to the heart muscle and deliver the measurements to the medical teams treating the patients.

By comparing the AI-generated blood flow results with the health outcomes of each patient, the team found that the patients with reduced blood flow were more likely to have adverse health outcomes including death, heart attack, stroke and heart failure.

The AI technique was therefore shown for the first time to be able to predict which patients might die or suffer major adverse events, better than a doctor could on their own with traditional approaches.

Professor James Moon (UCL Institute of Cardiovascular Science and Barts Health NHS Trust) said: "Artificial intelligence is moving out of the computer labs and into the real world of healthcare, carrying out some tasks better than doctors could do alone. We have tried to measure blood flow manually before, but it is tedious and time-consuming, taking doctors away from where they are needed most, with their patients."

Dr Kristopher Knott (UCL Institute of Cardiovascular Science and Barts Health NHS Trust) added: "The predictive power and reliability of the AI was impressive and easy to implement within a patient's routine care. The calculations were happening as the patients were being scanned, and the results were immediately delivered to doctors. As poor blood flow is treatable, these better predictions ultimately lead to better patient care, as well as giving us new insights into how the heart works."

Dr Peter Kellman from the National Institutes of Health (NIH) in the US, who, working with Dr Hui Xue at the NIH, developed the automated AI techniques used to analyse the images in the study, said: "This study demonstrates the growing potential of artificial intelligence-assisted imaging technology to improve the detection of heart disease and may move clinicians closer to a precision medicine approach to optimize patient care. We hope that this imaging approach can save lives in the future."

Credit: 
University College London

How social media makes breakups that much worse

Imagine flipping through your Facebook News Feed first thing in the morning and spotting a notification that your ex is now "in a relationship."

Or maybe the Memories feature shows a photo from that beach vacation you took together last year. Or your ex-lover's new lover's mom shows up under People You May Know.

Scenarios like these are real and not uncommon, according to a new University of Colorado Boulder study exploring how breaking up is even harder to do in the digital age.

"Before social media, break-ups still sucked, but it was much easier to get distance from the person," said Anthony Pinter, a doctoral student in the information science department and lead author of the study published in the journal Proceedings of the ACM (Association for Computing Machinery)."It can make it almost impossible to move on if you are constantly being bombarded with reminders in different places online."

The research team recruited participants who had experienced an upsetting encounter online involving a break-up within the past 18 months and interviewed them for over an hour.

Among the 19 participants who underwent in-depth interviews, a disturbing trend emerged: Even when people took every measure they could to remove their exes from their online lives, social media brought them back - often multiple times a day.

"A lot of people make the assumption that they can just unfriend their ex or unfollow them and they are not going to have to deal with this anymore," said Pinter. "Our work shows that this is not the case."

News Feed, the primary interface that opens when one launches Facebook, was a major source of distress, delivering news of ex-lovers announcing they were in a new relationship. In one case, a participant noticed his roommate had already "liked" his ex's post. He was the last of his friends to know.

Memories, which revives posts from years past, was equally heart-rending, with one participant recalling how a sweet, years-old message from his ex-wife popped up out of nowhere, delivering an "emotional wallop."

Many shared stories of encountering exes via their comments in shared spaces, such as groups or mutual friends' pictures.

"In real life, you get to decide who gets the cat and who gets the couch, but online it's a lot harder to determine who gets this picture or who gets this group," said Pinter.

Take A Break works - for some

In 2015, Facebook launched the Take A Break feature, which detects when a user switches from "in a relationship" to "single" and asks if they want the platform to hide that person's activities. But people like Pinter, who don't use the Relationship Status tool, never get such an offer.

"Facebook doesn't know we broke up because Facebook never knew we were in a relationship," he said.

Even when someone unfriends their ex, if a mutual friend posts a picture without tagging them in it, that picture may still flow through their feed.

And even when they blocked their exes entirely, some reported that the ex's friends and family would still show up on Facebook as suggestions under People You May Know.

"Am I never going to be free of all this crap online?" asked one exasperated participant.

The research stems from a larger National Science Foundation grant award called Humanizing Algorithms, aimed at identifying and offering solutions for "algorithmic insensitivity."

"Algorithms are really good at seeing patterns in clicks, likes and when things are posted, but there is a whole lot of nuance in how we interact with people socially that they haven't been designed to pick up," said Brubaker.

The authors suggest that such encounters could be minimized if platform designers paid more attention to the "social periphery" - all those people, groups, photos and events that spring up around a connection between two users.

For those wanting to rid their online lives of reminders of love lost, they recommend unfriending, untagging, using Take a Break and blocking, while understanding these measures may not be foolproof.

Your best bet, said Pinter: "Take a break from social media for a while until you are in a better place."

Credit: 
University of Colorado at Boulder

Early treatment for PTSD after a disaster has lasting effects

image: Armen Goenjian, M.D.

Image: 
UCLA Health

In 1988, a 6.9 magnitude earthquake struck near the northern Armenian city of Spitak. The temblor destroyed cities and is estimated to have killed between 25,000 and 35,000 people, many of whom were schoolchildren.

The latest findings from a long-term, UCLA-led study reveal that children who survived the quake and received psychotherapy soon after have experienced health benefits into adulthood.

The findings are particularly relevant today, said Dr. Armen Goenjian, the study's lead author and a researcher at the Jane and Terry Semel Institute for Neuroscience and Human Behavior at UCLA, given the increased frequency and severity of climate-related catastrophes such as hurricanes and wildfires.

This ongoing project is one of the first long-term studies to follow survivors of a natural disaster who experienced post-traumatic stress disorder, or PTSD, more than five years after the event. The research tracks PTSD and depression symptoms in people who received psychotherapy as children, as well as those who did not.

The latest findings, published in the journal Psychological Medicine, also identified factors that contributed to the risk for PTSD and depression among the Spitak quake survivors, including whether their homes were destroyed, the severity of adversity they faced after the earthquake and whether they had chronic medical illnesses after the quake. People who experienced strong social support were less likely to develop PTSD and depression.

"The association of persistent PTSD and depression with chronic medical illnesses points to the need for targeted outreach services across physical and behavioral health systems," said Goenjian, who is also director of the Armenian Relief Society Clinics in Armenia.

The researchers evaluated 164 survivors who were 12 to 14 years old in 1990, about a year and a half after the earthquake. Of that group, 94 lived in the city of Gumri, which experienced substantial destruction and thousands of deaths. The other 70 lived in Spitak, where the damage was far more severe and there was a higher rate of death.

A few weeks after the initial assessment, mental health workers provided trauma- and grief-focused psychotherapy in some schools in Gumri, but not in others because of a shortage of trained medical staff.

"We were comparing two devastated cities that had different levels of post-earthquake adversities," Goenjian said. "People in Spitak, who experienced more destruction, earthquake-related deaths and injuries but experienced fewer post-earthquake adversities, had a better recovery from PTSD and depression than survivors in Gumri."

Researchers interviewed survivors five and 25 years after the earthquake. They found that people from Gumri who received psychotherapy had significantly greater improvements in both their depression and PTSD symptoms. On the 80-point PTSD-Reaction Index, for example, PTSD scores for the Gumri group that received psychotherapy dropped from an average of 44 a year and a half after the earthquake to 31 after 25 years.

PTSD scores for people from Gumri who did not receive treatment declined as well, but not as much: from 43 at one-and-a-half years to 36 after 25 years.
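Put side by side, the arithmetic on those quoted averages (on the 80-point index) works out to:

\[
\text{Treated (Gumri): } 44 - 31 = 13 \text{ points} \ (\approx 30\%), \qquad
\text{Untreated (Gumri): } 43 - 36 = 7 \text{ points} \ (\approx 16\%).
\]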

Overall, people from Spitak had more severe PTSD and depression after the earthquake. Because they experienced fewer ongoing challenges, such as shortage of heat, electricity, housing and transportation, they tended to show greater improvements in their PTSD symptoms compared to both Gumri groups. The PTSD symptoms for Spitak survivors fell from 53 at one-and-a-half years to 39 after 25 years.

"The takeaway is that school-based screening of children for post-traumatic stress reactions and depression, along with providing trauma and grief-focused therapy after a major disaster is strongly recommended," Goenjian said.

Credit: 
University of California - Los Angeles Health Sciences

TPU researchers discover how to improve safety of nuclear power plants

Researchers at Tomsk Polytechnic University found a method to increase fuel lifetime by 75%. According to the research team, it will significantly increase safety and reduce the operating cost of nuclear power plants in hard-to-reach areas. The study results were published in Nuclear Engineering and Design.

Previously, a team of researchers from the Russian Federal Nuclear Center - All-Russian Research Institute of Technical Physics, Tomsk Polytechnic University, and the Budker Institute of Nuclear Physics proposed the concept of a thorium hybrid reactor, where high-temperature plasma confined in a long magnetic trap is used to obtain additional neutrons. Unlike operating reactors, the proposed thorium hybrid reactor has moderate power, relatively small size, high operational safety, and low level of radioactive waste.

One of the biggest challenges for the development of remote areas, such as the Far North, is a stable energy supply. According to Tomsk researchers, often the only solution is to use low-power nuclear plants.

However, reactor refueling, one of the most hazardous and time-consuming procedures in nuclear energy, is a significant problem. "Reducing refueling frequency will drastically improve operational safety. Furthermore, it reduces the costs of transporting fresh fuel, or a nuclear power plant itself, to a transshipment site," says Vladimir Nesterov, associate professor of the TPU Division for Nuclear-Fuel Cycle.

The scientists carried out theoretical calculations proving the possibility of creating a thorium-based nuclear fuel cycle. Thorium is four times as abundant as uranium. Additionally, thorium fuel has a significantly higher regeneration intensity of fissile isotopes necessary for energy production.

"The achieved results can draw the attention of the scientific community to the potential of the thorium nuclear fuel cycle. We demonstrated that the implementation of this cycle in a low-power reactor installation results in increasing of fuel lifetime by 75%," the expert says.

In the future, the researchers plan to continue their calculations with the verified software and to carry out thermophysical calculations of low-power reactors operating in the thorium-uranium fuel cycle, with subsequent implementation of the developed calculation methods in the educational process.

Credit: 
Tomsk Polytechnic University

Catalyst deposition on fragile chips

image: Yen-Ting Chen at the transmission electron microscope

Image: 
RUB, Kramer

Researchers at the Ruhr-Universität Bochum (RUB) and the University of Duisburg-Essen have developed a new method for depositing catalyst particles onto tiny electrodes. It is inexpensive, simple and quick to perform. In order to characterize catalysts and test their potential for various applications, researchers have to fix the particles to electrodes so that they can then be examined, for example, with transmission electron microscopy.

The new method is described by Dr. Tsvetan Tarnev and Professor Wolfgang Schuhmann from the Center for Electrochemistry at RUB with Steffen Cychy and Professor Martin Muhler, RUB Chair of Technical Chemistry, as well as Professor Corina Andronescu, University of Duisburg-Essen, and Dr. Yen-Ting Chen from the Bochum Center for Solvation Science in the journal Angewandte Chemie, published online on 20 January 2020.

Wafer-thin electrodes

In transmission electron microscopy, TEM for short, a thin electron beam is sent through the sample to observe the electrochemical processes taking place at an electrode. In order for the beam to penetrate the structures, all sample components must be very thin. The diameter of the electrode to which the catalyst is applied is therefore only ten micrometers.

Depositing catalyst particles drop by drop

With earlier methods, the catalyst particles were either distributed evenly throughout the sample, i.e. even where they were not needed, or methods were used that could damage the material. Both disadvantages are eliminated with the new method, which is based on scanning electrochemical cell microscopy. The researchers fill a glass capillary with a liquid containing the catalyst particles. They then bring the capillary toward the electrode onto which the particles are to be deposited. A tiny drop of the particle liquid hangs at the lower opening of the capillary.

The researchers advance the capillary toward the electrode until the drop of liquid comes into contact with the electrode and closes an electrical circuit. This automatically stops the approach, preventing damage to the material. The scientists then retract the capillary, but the drop of liquid remains on the electrode. This step can be repeated as often as required. Finally, the researchers evaporate the solvent so that only the catalyst particles remain, now fixed to the electrode.

Suitable for many catalyst materials

"Once the methodology is established, it offers a clean, easy-to-use and variable way of applying and measuring a large number of different catalyst materials stably and reproducibly on liquid cell TEM chips," says Wolfgang Schuhmann.

Credit: 
Ruhr-University Bochum

Scientists reveal catalytic mechanism of lovastatin hydrolase

image: Lovastatin hydrolase PcEST specifically and efficiently catalyzes the conversion of lovastatin to monacolin J, but cannot hydrolyze simvastatin

Image: 
LIANG Yajing

Hyperlipidemia, one of the most common threats to human health, refers to an abnormal increase of cholesterol and/or triglycerides in the blood. One effective method for prevention and treatment of the disease is cholesterol-lowering therapy, such as the drug simvastatin.

Alkaline hydrolysis of lovastatin to produce monacolin J is an intermediate step to obtain simvastatin. Enzymatic synthesis using a specific and efficient lovastatin hydrolase is one of the alternative methods for green production of monacolin J.

Recently, the research team led by Prof. LU Xuefeng from the Qingdao Institute of Bioenergy and Bioprocess Technology (QIBEBT), Chinese Academy of Sciences (CAS), revealed the catalytic mechanism and structure-function relationship of the specific and efficient lovastatin hydrolase PcEST.

It is the first report describing the mechanism and structure-function relationship of lovastatin hydrolase and provides insights about further lovastatin hydrolase screening, engineering, and commercial applications. The results were published in the Journal of Biological Chemistry.

Structure-based biochemical analyses and mutagenesis assays revealed that the Ser-57 (nucleophile)-Tyr-170 (general base)-Lys-60 (general acid) catalytic triad, together with the hydrogen-bond network around the active site and the specific substrate-binding tunnel determine the efficient and specific lovastatin hydrolysis by PcEST.

Furthermore, using structure-guided enzyme engineering, the researchers developed a PcEST variant, D106A, with improved solubility and thermostability, suggesting a promising application of this variant in industrial processes.

Credit: 
Chinese Academy of Sciences Headquarters

Vitamin C may shorten ventilation in critically ill patients

Vitamin C administration shortened the duration of mechanical ventilation in critical care patients, but the effect depended on the severity of illness.

In five controlled trials including 471 patients requiring ventilation for over 10 hours, vitamin C shortened ventilation time on average by 25%, according to a meta-analysis published in the Journal of Intensive Care.

Vitamin C has numerous biochemical effects. It can influence the cardiovascular system through its involvement in the synthesis of norepinephrine and vasopressin, and energy metabolism through its participation in the synthesis of carnitine. In randomized trials, vitamin C has lowered blood pressure, decreased the incidence of atrial fibrillation and decreased bronchoconstriction. A previous meta-analysis of 12 controlled trials found that vitamin C reduced ICU stay on average by 8%.

Critical care patients often have very low vitamin C plasma levels. In healthy people, 0.1 grams per day of vitamin C is usually sufficient to maintain a normal plasma level. However, much higher doses, in the order of grams per day, are needed for critically ill patients to increase their plasma vitamin C levels to within the normal range. Therefore, high vitamin C doses may be needed to compensate for the increased metabolism in critically ill patients.

Harri Hemilä from the University of Helsinki, Finland, and Elizabeth Chalker from the University of Sydney, Australia, carried out a systematic review of vitamin C for mechanically ventilated critical care patients. They identified 9 relevant controlled trials, and 8 of them were included in the meta-analysis.

On average, vitamin C administration shortened ventilation time by 14%, but the effect of vitamin C depended on the duration of ventilation. Patients who are more seriously ill require longer ventilation than those who are not as sick. Therefore, Hemilä and Chalker hypothesized that the effect of vitamin C might be greater in trials with sicker patients who need longer ventilation.

Vitamin C had no effect when ventilation lasted for 10 hours or less. However, in 5 trials including 471 patients who required ventilation for over 10 hours, dosages of 1 to 6 g/day of vitamin C shortened ventilation time on average by 25%.
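For readers curious how a pooled percentage like this is typically derived, here is a minimal, generic sketch of inverse-variance pooling of ratios of means; the trial figures in it are hypothetical placeholders, not the data analysed by Hemilä and Chalker.

```python
# Generic fixed-effect inverse-variance pooling of log ratio-of-means estimates.
# All numbers are hypothetical placeholders, not data from the review.
import math

# (ratio of mean ventilation time, vitamin C vs control; standard error of the log-ratio)
trials = [(0.70, 0.10), (0.80, 0.15), (0.72, 0.12), (0.78, 0.20), (0.75, 0.09)]

weights = [1 / se**2 for _, se in trials]
pooled_log = sum(w * math.log(r) for (r, _), w in zip(trials, weights)) / sum(weights)
pooled_ratio = math.exp(pooled_log)

print(f"Pooled ratio of means: {pooled_ratio:.2f} "
      f"(~{(1 - pooled_ratio) * 100:.0f}% shorter ventilation)")
```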

"Vitamin C is a safe, low-cost essential nutrient. Given the strong evidence of benefit for more severely ill critical care patients along with the evidence of very low vitamin C levels in such patients, ICU patients may benefit from the administration of vitamin C. Further studies are needed to determine optimal protocols for its administration. Future trials should directly compare different dosage levels," says Dr. Hemilä.

Credit: 
University of Helsinki

KIST unveils the mystery of van der Waals magnets, a material for future semiconductors

image: The team conducted an experiment in which they observed the material while controlling the number of electrons, leading them to discover changes in the properties of FGT. The team proved that the magnetic anisotropy, which describes how the material's magnetic properties change depending on the direction, contributed to such changes.

Image: 
Korea Institute of Science and Technology (KIST)

Drs. Chaun Jang, Jun Woo Choi, and Hyejin Ryu of the Korea Institute of Science and Technology (KIST, President Lee Byung Gwon) have announced that their team at KIST's Center for Spintronics successfully controlled the magnetic properties of FGT (Fe3GeTe2) in a joint research project with Dr. Se Young Park and his team at the Center for Correlated Electron Systems at the Institute for Basic Science (IBS). Fe3GeTe2 has recently attracted attention as a material for next-generation spintronic semiconductors.

*Named by combining the terms "spin" and "electronics," "spintronics" is a new field in electronic engineering that aims to replace conventional silicon semiconductors by utilizing electron spin, a quantum property of electrons.

Van der Waals materials, also known as two-dimensional (2D) materials, are layered materials composed of planes that are attached to each other via a weak van der Waals interaction. These include various materials such as graphene and molybdenum disulfide. When combined with other 2D materials, they can create new materials that show previously undiscovered properties. This is why 2D materials, which have a variety of properties such as superconductivity, semiconductivity, and metallicity, have been the subject of so many studies.

In 2017, 2D van der Waals materials that show magnetic properties were discovered, stimulating research projects and studies all around the world. However, most van der Waals magnetic materials have some constraints in terms of spintronics application because of their low Curie temperature** and high coercivity,*** making them unsuitable for use in certain devices.

** Curie temperature: a transition temperature point where a ferromagnetic material changes to a paramagnetic one or vice versa.

*** Coercivity: the intensity of magnetic field required to reduce the magnetic flux density of a ferromagnetic material to zero after the magnetism of that material has been saturated.

A number of studies have been done on FGT, a recently discovered van der Waals material with a layered structure. The joint KIST-IBS research team discovered an efficient scheme for controlling the properties of FGT. The team conducted an experiment in which they observed the material while controlling the number of electrons, leading them to discover changes in the properties of FGT. The team proved that the magnetic anisotropy,**** which describes how the material's magnetic properties change depending on the direction, contributed to such changes.

**** Magnetic anisotropy: This refers to the directional dependence of a material's magnetic properties on a crystallographic or geometric structure. Depending on such structures, a material can have easy or hard magnetization directions.

The research results revealed the origin of the changes in the FGT magnetic properties, thus presenting a possible method of efficiently controlling the properties of 2D magnetic materials. Furthermore, the research team announced that, by controlling the properties of single-atom-thick van der Waals magnetic materials, the development of spintronic devices that operate 100 times faster than current silicon-based electronic devices could potentially be accelerated.

Dr. Hyejin Ryu of KIST said, "We started this study to discover the magnetic properties of van der Waals materials and apply such properties to spintronic devices." She added, "Further development of new materials for semiconductors with various properties will be possible through the use of van der Waals magnetic materials and heterostructures based on other van der Waals materials."

Credit: 
National Research Council of Science & Technology

Broadband transmission-type coding metasurface for electromagnetic beam forming and scanning

image: Structural illustrations of the broadband 1-bit coding particles and the corresponding transmission amplitude and phase responses.

Image: 
©Science China Press

Due to their excellent performance in manipulating electromagnetic (EM) waves freely and flexibly, metasurfaces have been widely investigated since the beginning of the 21st century. However, with the rapid development of digital information technology, traditional analog metasurfaces with continuous phase control have difficulty handling digital information. In 2014, digital coding and programmable metasurfaces were proposed, which make it possible to manipulate EM waves from the digital perspective, building a bridge between information science and physical metasurfaces. Recently, researchers have proposed a novel broadband transmission-type 1-bit digital coding metasurface and analyzed how it manipulates EM far-field radiation.

The related paper entitled "Broadband transmission-type 1-bit coding metasurface for beam forming and scanning" was published in SCIENCE CHINA Physics, Mechanics & Astronomy, in which RuiYuan Wu and Prof. TieJun Cui from Southeast University are the first author and the corresponding author, respectively.

For a long time, improving the working bandwidth has been a major challenge for metasurfaces. Generally, the wideband design of reflection-type metasurfaces is easier to implement because only the phase response needs to be considered; the reflection amplitude remains above 95% due to the presence of the metallic ground. In contrast, in the design of transmission-type metasurfaces, not only must the phase response satisfy the requirements of the digital coding scheme, but a high transmission amplitude is also demanded. Realizing both conditions relies on strong resonance of the digital particle, which is very difficult to maintain over a wide band.

To overcome this difficulty, the authors adopted a multi-layer transmission-type structure, as shown in Figure 1(a), to implement the digital particles. The structure is composed of four identical metallic square patches (see Figure 1(b)), a metallic cross-shaped slot layer (see Figure 1(c)), and dielectric substrates. By adjusting the sizes of the square patches, the phase responses of the transmitted EM waves change accordingly. After optimizing the trade-off between high transmittance and the phase difference required for 1-bit coding, two digital particles with different geometries were designed to represent the digital states '0' and '1'. As displayed in Figures 1(d) and 1(e), the phase difference between the two particles was kept near 180° to ensure the 1-bit coding effect over the wide band of 8.1-12.5 GHz with high transmittance, corresponding to a relative bandwidth of over 40%.
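The quoted figure follows from the band edges via the standard fractional-bandwidth formula:

\[
\frac{\Delta f}{f_c} \;=\; \frac{2\,(f_H - f_L)}{f_H + f_L} \;=\; \frac{2\,(12.5 - 8.1)}{12.5 + 8.1} \;\approx\; 43\%.
\]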

Based on the broadband characteristics of the digital particles, a digital coding metasurface was first constructed to achieve high-directivity beam forming across the wide band, with sidelobe levels below -10 dB, as shown in Figures 2(a) and 2(b). Furthermore, the digital particles on the metasurface were encoded with the sequence '010101...' to radiate two symmetrical beams, whose deflection angles scan continuously in one dimension as the working frequency changes. The scanning angle is more than 20°, as shown in Figures 2(c) and 2(d). The proposed design breaks the current bandwidth limit of transmission-type coding metasurfaces, indicating wide application potential in radar and wireless communication systems.
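One way to see why the two beams scan with frequency: for a periodic '0101...' coding pattern, the generalized Snell's law gives the deflection angle as θ = arcsin(λ/Γ), where Γ is the spatial period of the coding sequence. The minimal sketch below evaluates this across the reported band for an assumed period; the paper's actual unit-cell dimensions are not given here, so Γ is purely a placeholder.

```python
# Illustrative frequency-scanning estimate for a '0101...' coded metasurface.
# The coding period GAMMA is an assumed placeholder, not a value from the paper.
import math

C = 3e8            # speed of light in vacuum, m/s
GAMMA = 48e-3      # assumed spatial period of the '01' coding sequence, m (hypothetical)

for f_ghz in (8.1, 9.0, 10.0, 11.0, 12.5):
    wavelength = C / (f_ghz * 1e9)                       # free-space wavelength, m
    theta = math.degrees(math.asin(wavelength / GAMMA))  # generalized Snell's law
    print(f"{f_ghz:4.1f} GHz -> beam deflection ~ {theta:4.1f} deg")
```

With this assumed period the deflection angle sweeps by roughly 20° between 8.1 and 12.5 GHz, which is consistent in spirit with the frequency-scanning behaviour described above.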

Credit: 
Science China Press

ESO telescope sees surface of dim Betelgeuse

image: The red supergiant star Betelgeuse, in the constellation of Orion, has been undergoing unprecedented dimming. This stunning image of the star's surface, taken with the SPHERE instrument on ESO's Very Large Telescope late last year, is among the first observations to come out of an observing campaign aimed at understanding why the star is becoming fainter. When compared with the image taken in January 2019, it shows how much the star has faded and how its apparent shape has changed.

Image: 
ESO/M. Montargès et al.

Using ESO's Very Large Telescope (VLT), astronomers have captured the unprecedented dimming of Betelgeuse, a red supergiant star in the constellation of Orion. The stunning new images of the star's surface show not only the fading red supergiant but also how its apparent shape is changing.

Betelgeuse has been a beacon in the night sky for stellar observers, but it began to dim late last year. At the time of writing, Betelgeuse is at about 36% of its normal brightness, a change noticeable even to the naked eye. Astronomy enthusiasts and scientists alike were excitedly hoping to find out more about this unprecedented dimming.
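In the magnitude scale astronomers use, a drop to 36% of normal brightness corresponds to a dimming of roughly (a back-of-the-envelope conversion, not an ESO figure):

\[
\Delta m \;=\; -2.5\,\log_{10}(0.36) \;\approx\; 1.1 \ \text{magnitudes}.
\]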

A team led by Miguel Montargès, an astronomer at KU Leuven in Belgium, has been observing the star with ESO's Very Large Telescope since December, aiming to understand why it's becoming fainter. Among the first observations to come out of their campaign is a stunning new image of Betelgeuse's surface, taken late last year with the SPHERE instrument.

The team also happened to observe the star with SPHERE in January 2019, before it began to dim, giving us a before-and-after picture of Betelgeuse. Taken in visible light, the images highlight the changes occurring to the star both in brightness and in apparent shape.

Many astronomy enthusiasts wondered if Betelgeuse's dimming meant it was about to explode. Like all red supergiants, Betelgeuse will one day go supernova, but astronomers don't think this is happening now. They have other hypotheses to explain what exactly is causing the shift in shape and brightness seen in the SPHERE images. "The two scenarios we are working on are a cooling of the surface due to exceptional stellar activity or dust ejection towards us," says Montargès [1]. "Of course, our knowledge of red supergiants remains incomplete, and this is still a work in progress, so a surprise can still happen."

Montargès and his team needed the VLT at Cerro Paranal in Chile to study the star, which is over 700 light-years away, and gather clues on its dimming. "ESO's Paranal Observatory is one of few facilities capable of imaging the surface of Betelgeuse," he says. Instruments on ESO's VLT allow observations from the visible to the mid-infrared, meaning astronomers can see both the surface of Betelgeuse and the material around it. "This is the only way we can understand what is happening to the star."

Another new image, obtained with the VISIR instrument on the VLT, shows the infrared light being emitted by the dust surrounding Betelgeuse in December 2019. These observations were made by a team led by Pierre Kervella from the Observatory of Paris in France, who explained that the wavelength of the image is similar to that detected by heat cameras. The clouds of dust, which resemble flames in the VISIR image, are formed when the star sheds its material back into space.

"The phrase 'we are all made of stardust' is one we hear a lot in popular astronomy, but where exactly does this dust come from?" says Emily Cannon, a PhD student at KU Leuven working with SPHERE images of red supergiants. "Over their lifetimes, red supergiants like Betelgeuse create and eject vast amounts of material even before they explode as supernovae. Modern technology has enabled us to study these objects, hundreds of light-years away, in unprecedented detail giving us the opportunity to unravel the mystery of what triggers their mass loss."

Credit: 
ESO

Are all sources of carbohydrates created equal?

UNIVERSITY PARK, PA - Potatoes are often equated with refined grains due to their carbohydrate content. Yet, potatoes contain fiber, resistant starch, and key micronutrients that Americans need more of in their diet. A randomized crossover study that included 50 generally healthy adults directly compared the nutrient quality and impact on cardiometabolic risk factors of non-fried potatoes to refined grains. The study was conducted by researchers at Penn State and was recently published in the British Journal of Nutrition. Its findings demonstrate that potatoes can support a healthy diet; daily intake of one serving of non-fried potato did not affect markers of glycemia and was associated with better diet quality compared to refined grains.

"Clinical studies are important to contextualize observational findings," say Penn State Researchers. "Some epidemiologic studies have suggested an association between potato intake and increased risk of cardiometabolic diseases. However, the context around how potatoes are eaten, such as their preparation method or other foods eaten alongside them, may be important factors in explaining why our clinical trial findings differ from those of observational studies."

Participants were randomly assigned to eat either a refined grain side dish (e.g., pasta, rice, white bread) or a steamed/baked potato side dish of equal calories each day with a main meal for four weeks. After a two-week break, the same individuals ate the opposite side dish with a main meal for another four weeks. Aside from being required to consume either a potato or refined grain side dish, no other dietary restrictions were placed on participants. Several markers of cardiometabolic risk were measured including plasma glucose, serum insulin, cholesterol and other blood lipids, blood pressure, and participants' reported diet quality.

While neither refined grains nor potatoes impacted cardiometabolic risk factors, participants' potassium and fiber intake, total vegetable and starchy vegetable intake, and Healthy Eating Index score - a measure of how well people follow the Dietary Guidelines for Americans - were higher when they ate potatoes, compared to refined grains.

"Americans eat too many refined carbohydrates and not enough whole grains or starchy vegetables, according to the 2015-2020 Dietary Guidelines for Americans. Our study findings suggest that eating 1 serving of non-fried potatoes in place of refined grains can help individuals meet more dietary recommendations."

The study had several strengths, such as the randomized crossover design and isocaloric dietary substitution. All dishes were prepared in a healthy way with limited added fat or sodium. However, the researchers noted a few limitations: the need for larger sample sizes, a longer intervention time and controlled dietary intake rather than self-reported diets. "It is important to replicate our findings in other groups, such as those at higher risk of cardiometabolic disease. These findings apply to the generally healthy population."

Credit: 
FoodMinds LLC