
COVID-19 may trigger new diabetes, experts warn

Emerging evidence suggests that COVID-19 may actually trigger the onset of diabetes in healthy people and also cause severe complications of pre-existing diabetes.

A letter published today in the New England Journal of Medicine and signed by an international group of 17 leading diabetes experts involved in the CoviDiab Registry project, a collaborative international research initiative, announces the establishment of a Global Registry of new cases of diabetes in patients with COVID-19.

The Registry aims to understand the extent and the characteristics of the manifestations of diabetes in patients with COVID-19, and the best strategies for the treatment and monitoring of affected patients, during and after the pandemic.

Clinical observations so far show a bi-directional relationship between COVID-19 and diabetes. On the one hand, diabetes is associated with increased risk of COVID-19 severity and mortality. Between 20 and 30% of patients who died with COVID-19 have been reported to have diabetes. On the other hand, new-onset diabetes and atypical metabolic complications of pre-existing diabetes, including life-threatening ones, have been observed in people with COVID-19.

It is still unclear how SARS-CoV-2, the virus that causes COVID-19, impacts diabetes. Previous research has shown that ACE-2, the protein that binds to SARS-CoV-2 and allows the virus to enter human cells, is located not only in the lungs but also in organs and tissues involved in glucose metabolism, such as the pancreas, the small intestine, fat tissue, the liver and the kidney. Researchers hypothesise that by entering these tissues, the virus may cause multiple and complex dysfunctions of glucose metabolism. It has also been known for many years that viral infections can precipitate type 1 diabetes.

Francesco Rubino, Professor of Metabolic Surgery at King's College London and co-lead investigator of the CoviDiab Registry project, said: "Diabetes is one of the most prevalent chronic diseases and we are now realizing the consequences of the inevitable clash between two pandemics. Given the short period of human contact with this new coronavirus, the exact mechanism by which the virus influences glucose metabolism is still unclear and we don't know whether the acute manifestation of diabetes in these patients represents classic type 1, type 2 or possibly a new form of diabetes".

Paul Zimmet, Professor of Diabetes at Monash University in Melbourne, Honorary President of the International Diabetes Federation and co-lead investigator in the CoviDiab Registry project said: "We don't yet know the magnitude of new-onset diabetes in COVID-19 and whether it will persist or resolve after the infection; and if so, whether COVID-19 increases the risk of future diabetes. By establishing this Global Registry, we are calling on the international medical community to rapidly share relevant clinical observations that can help answer these questions".

Stephanie Amiel, Professor of Diabetes Research at King's College London and a co-investigator of the CoviDiab Registry project said: "The registry focuses on routinely collected clinical data that will help us examine insulin secretory capacity, insulin resistance and autoimmune antibody status to understand how COVID-19 related diabetes develops, its natural history and best management. Studying COVID-19-related diabetes may uncover novel mechanisms of disease."

Credit: 
King's College London

A* model

Like most galaxies, the Milky Way hosts a supermassive black hole at its center. Called Sagittarius A*, the object has captured astronomers' curiosity for decades. And now there is an effort to image it directly.

Catching a good photo of the celestial beast will require a better understanding of what's going on around it, which has proved challenging due to the vastly different scales involved. "That's the biggest thing we had to overcome," said Sean Ressler, a postdoctoral researcher at UC Santa Barbara's Kavli Institute for Theoretical Physics (KITP), who just published a paper in the Astrophysical Journal Letters, investigating the magnetic properties of the accretion disk surrounding Sagittarius A*.

In the study, Ressler, fellow KITP postdoc Chris White and their colleagues, Eliot Quataert of UC Berkeley and James Stone at the Institute for Advanced Study, sought to determine whether the black hole's magnetic field, which is generated by in-falling matter, can build up to the point where it briefly chokes off this flow, a condition scientists call magnetically arrested. Answering this would require simulating the system all the way out to the closest orbiting stars.

The system in question spans seven orders of magnitude. The black hole's event horizon, or envelope of no return, reaches around 4 to 8 million miles from its center. Meanwhile, the stars orbit around 20 trillion miles away, or about as far as the sun's nearest neighboring star.

"So you have to track the matter falling in from this very large scale all the way down to this very small scale," said Ressler. "And doing that in a single simulation is incredibly challenging, to the point that it's impossible." The smallest events proceed on timescales of seconds while the largest phenomena play out over thousands of years.

This paper connects small-scale simulations, which are mostly theory-based, with large-scale simulations that can be constrained by actual observations. To achieve this, Ressler divided the task between models at three overlapping scales.

The first simulation relied on data from Sagittarius A*'s surrounding stars. Fortunately, the black hole's activity is dominated by just 30 or so Wolf-Rayet stars, which blow off tremendous amounts of material. "The mass loss from just one of the stars is larger than the total amount of stuff falling into the black hole during the same time," Ressler said. The stars spend only around 100,000 years in this dynamic phase before transitioning into a more stable stage of life.

Using observational data, Ressler simulated the orbits of these stars over the course of about a thousand years. He then used the results as the starting point for a simulation of medium-range distances, which evolve over shorter time scales. He repeated this for a simulation down to the very edge of the event horizon, where activity takes place in a matter of seconds. Rather than stitching the simulations together at hard boundaries, this approach allowed Ressler to fade the results of the three simulations into one another.
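
Conceptually, the hand-off between scales can be pictured as a smooth radial blend rather than a hard cut. The sketch below is a minimal illustration of that idea, not the authors' code; the field arrays and overlap radii are hypothetical:

```python
import numpy as np

# Minimal sketch of fading two overlapping-scale simulations into one
# another (illustrative only, not the authors' code). `outer` and `inner`
# are hypothetical field values (e.g., density) from the large- and
# small-scale runs, sampled on the same radial grid r; within the overlap
# region [r_in, r_out] a weight ramps smoothly between them instead of
# switching at a single hard boundary.
def blend(r, outer, inner, r_in, r_out):
    w = np.clip((r_out - r) / (r_out - r_in), 0.0, 1.0)  # 1 at r_in, 0 at r_out
    return w * inner + (1.0 - w) * outer
```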

"These are really the first models of the accretion at the smallest scales in [Sagittarius] A* that take into account the reality of the supply of matter coming from orbiting stars," said coauthor White.

And the technique worked splendidly. "It went beyond my expectations," Ressler remarked.

The results indicated that Sagittarius A* can become magnetically arrested. This came as a surprise to the team, since the Milky Way has a relatively quiet galactic center. Usually, magnetically arrested black holes have high-energy jets shooting particles away at relativistic speeds. But so far scientists have seen little evidence for jets around Sagittarius A*.

"The other ingredient that helps create jets is a rapidly spinning black hole," said White, "so this may be telling us something about the spin of Sagittarius A*."

Unfortunately, black hole spin is difficult to determine. Ressler modeled Sagittarius A* as a stationary object. "We don't know anything about the spin," he said. "There's a possibility that it's actually just not spinning."

Ressler and White next plan to model a spinning black hole, which is much more challenging. It immediately introduces a host of new variables, including spin rate, direction and tilt relative to the accretion disk. They will use data from the European Southern Observatory's GRAVITY interferometer to guide these decisions.

The team used the simulations to create images that can be compared to actual observations of the black hole. Scientists at the Event Horizon Telescope collaboration -- which made headlines in April 2019 with the first direct image of a black hole -- have already reached out requesting the simulation data in order to supplement their effort to photograph Sagittarius A*.

The Event Horizon Telescope effectively takes a time average of its observations, which results in a blurry image. This was less of an issue when the observatory had its sights on Messier 87*, because it is around 1,000 times larger than Sagittarius A*, so it changes around 1,000 times more slowly.

"It's like taking a picture of a sloth versus taking a picture of a hummingbird," Ressler explained. Their current and future results should help the consortium interpret their data on our own galactic center.

Ressler's results are a big step forward in our understanding of the activity at the center of the Milky Way. "This is the first time that Sagittarius A* has been modeled over such a large range in radii in 3D simulations, and the first event horizon-scale simulations to employ direct observations of the Wolf-Rayet stars," Ressler said.

Credit: 
University of California - Santa Barbara

Study links elevated levels of advanced glycation end-products (AGEs) with breast cancer risk

image: Dr. David Turner cooks with his kids to help them develop long-term lifestyle habits and healthy eating behaviors. Turner published a study linking high levels of AGEs from processed food in the body with breast cancer risk.

Image: 
MUSC HCC

Hollings Cancer Center researchers at the Medical University of South Carolina (MUSC) and colleagues assessed the connection between dietary advanced glycation end products (AGEs) and breast cancer risk in a study first published online March 2020 in Cancer Prevention Research.

It supports an increasingly evident link between high levels of AGEs in the body and cancer risk, said principal investigator David Turner, Ph.D., who worked with colleagues Susan Steck, Ph.D., with the University of South Carolina, and Lindsay Peterson, M.D., with Washington University School of Medicine.

The study was part of a larger decade-long prostate, lung, colorectal and ovarian cancer screening trial (PLCO) designed and sponsored by the National Cancer Institute. It included over 78,000 women between the ages of 55 and 74 years who were cancer free at the start of the study. The women completed a food frequency questionnaire at the beginning and again at five years into the study. After an average of 11 ½ years, 1,592 of the women were diagnosed with breast cancer. When the intake of high-AGE food was assessed, based on the questionnaires, increased AGE intake via the diet was associated with an increased risk of in situ and hormone receptor positive breast cancers.

Advanced glycation end products are proteins and lipids (fats) that go through a chemical alteration called glycation when they are exposed to sugars. This process occurs naturally in the body. However, processed foods and foods cooked at high temperatures are extremely high in AGEs, which can lead to a dangerous overabundance in the body.

Turner said AGEs are involved in nearly every chronic disease, in some way. "The study of AGEs in cancer is just starting to get traction. The presence of AGEs has been known for at least 100 years, but the research has been challenging. In order to determine how they work, their mechanism of action, researchers first have to determine a role in various diseases."

Turner said this study is important because it adds to the evidence between high levels of AGEs in the body and cancer risk. Turner and his collaborators are promoting the connection between AGEs and lifestyle choices to help the public make better food choices.

This will become an even more popular area of study as researchers employ new tools to help study AGEs. "A novel device, the AGE reader, is about to change how we look at AGEs in the clinic," Turner said. The AGE reader, made by Diagnoptics, is an easy-to-use, noninvasive device on which a person rests their forearm for just 12 seconds. It uses light at certain wavelengths to excite AGE autofluorescence in human skin tissue.

"This machine actually measures glow from some of the AGEs. The more AGEs that are in the skin, the higher the glow," explained Turner.

While the AGE reader has been used to show strong correlations between AGE levels and Type 2 diabetes, cardiovascular disease and even mortality, Turner is using a cancer center support grant to validate further the AGE reader for use in cancer patients. He and his colleagues plan to investigate whether pigmentation in the skin skews the reading and use the reader as part of a growing community outreach program.

Since a link between AGEs and breast cancer has been shown, the ultimate goal is to test all Hollings Cancer Center patients who are interested at each visit, Turner said. This will provide a huge amount of data about the link between AGEs and a wide variety of cancers. Turner and his collaborators expect that future multicenter grants will come out of this project.

While the connection between high AGE levels and cancer risk might be disconcerting, research is also being done to determine if there is a way to reverse the detrimental effects of AGEs.

Bradley Krisanits, a Ph.D. student in Turner's lab, said that preliminarily, they have seen that physical activity reduces the amount of AGEs in the circulation.

"In our prostate cancer models, we see that physical activity counteracts prostate cancer progression in mice fed a high-AGE diet. This may be occurring due to a reduction in AGEs and changes in the immune system that we need to study more."

Turner hopes that by educating people about AGEs, they can make informed lifestyle decisions and lower their risks for chronic diseases. The top three things a person can do are to learn what AGEs are, avoid processed foods, and think about how food is cooked in order to avoid the highest AGE-inducing cooking methods, such as frying, grilling and broiling.

"AGEs build up in a cumulative way. Fats, sugars, everything that is bad for you leads to the accumulation of AGEs. One of our goals at Hollings is to reach out to the community to encourage the public to make healthier choices. Just making small changes in your diet can have a big effect."

Credit: 
Medical University of South Carolina

As rare animals disappear, scientist faces 'ecological grief'

Five years before the novel coronavirus ran rampant around the world, saiga antelopes from the steppes of Eurasia experienced their own epidemic.

Millions of these grazing animals--easily recognizable by their oversized snouts--once migrated across what is today Kazakhstan, Mongolia, Georgia and more.

But then, over the span of three weeks in 2015, nearly 200,000, or two-thirds of their existing population, sickened and died from a bacterial infection. Today, a little more than 100,000 saiga hang onto survival in a few pockets of Eurasia.

The decline, and uncertain fate, of the saiga is a story that resonates with Joanna Lambert. She's a conservation biologist at the University of Colorado Boulder and a coauthor of a paper published this week in the journal Frontiers in Ecology and Evolution. The study explores the current state of ungulates, or hoofed animals like the saiga, in the western U.S. and around the world.

Lambert, who has studied ecological communities in both North America and Equatorial Africa, explained that many of these creatures aren't well-known outside of their home regions. But when these animals disappear, entire ecosystems can reshuffle, occasionally beyond recognition.

"We're losing these animals without people ever knowing they were there in the first place," said Lambert, a professor in the Program of Environmental Studies at CU Boulder.

For the researcher, the study's publication marks an opportunity to reflect on how she stays hopeful even amid tremendous losses--and how to talk about the natural world during a period of unprecedented social upheaval.

"I tell my students, 'I have to give you the facts. This is the world you're growing up in, but don't let that paralyze you,'" Lambert said.

Unsung species

The new research was led by Joel Berger of Colorado State University and also included scientists from Bhutan, Argentina and Chile.

The team decided to look at ungulates because--with a few exceptions like rhinos and elephants--they don't usually pop up in brochures for conservation organizations. But, Lambert said, they're still in trouble: the huemul, for example, once roamed across the Patagonia region of South America. Today, a little more than 1,000 of these fluffy deer still live in the wild. The tamaraw, a pint-sized buffalo from the Philippines, is down to just a few hundred individuals.

"The whole world knows the stories of pandas and mountain gorillas, but there are untold numbers of unsung species that come and go without the world's attention," she said.

Their cases also show just how complicated conservation can be.

Lambert has spent years trekking the grasslands and forests of Yellowstone National Park to study wildlife. After federal officials killed all the park's wolves in the 1940s, elk herds there began to multiply--big time. Head counts for these herbivores surged from a few thousand individuals to tens of thousands, and they devoured once-abundant plants like cottonwood and willow trees.

"When you pull one species out of its community, or if you add a new one in, the entire assembly changes," Lambert said. "That has been the history of what humans have done on the planet."

When the park brought wolves back in the 1990s, and elk numbers dropped back down, something unexpected happened: beavers, which had also disappeared from Yellowstone, began reappearing, too. The furry swimmers, it turns out, depend on those same tree species to build their dams.

"In many cases, we don't know what rules these ecosystems followed in the past," she said. "Even when we do know, it doesn't matter because we now have this added element of human tinkering."

Ecological grief

Lambert has also struggled to keep going as a conservation biologist as the wilds around her field sites in Africa and North America dwindled, then vanished entirely.

"As I returned each year from the field, it was taking me longer and longer to recover from a sort of existential depression," she said. "I realized that I have been profoundly impacted by the losses I've seen."

Many of Lambert's students feel similarly hopeless, a phenomenon that psychologists call "ecological grief." She tells them to focus on the success stories, however rare they are. Protected areas like Yellowstone have saved countless animals from extinction and have given others like wolves new chances at survival. Lambert is also providing scientific guidance around proposals to return wolves to Colorado.

And there are still a lot of animals out there--including the few remaining herds of big-nosed saiga.

"We need to fight like hell to keep all that," she said.

Credit: 
University of Colorado at Boulder

State-level R&D tax credits spur growth of new businesses

Here's some good news for U.S. states trying to spur an economic recovery in the years ahead: The R&D tax credit has a significant effect on entrepreneurship, according to a new study led by an MIT professor.

Moreover, the study finds a striking contrast between two types of tax credits. While the R&D tax credit fuels high-quality new-firm growth, the state-level investment tax credit, which supports general business needs, actually has a slightly negative economic effect on that kind of innovative activity.

The underlying reason for the difference, the study's authors believe, is that R&D tax credits, which are for innovative research and development, help ambitious startup firms flourish. But when states are simply granting investment tax credits, allowing long-established firms to expand, they are supporting businesses with less growth ahead of them, and thus not placing winning policy bets over time.

"What we see is an improvement in the environment for entrepreneurship in general, specifically for those growth-oriented startups that ultimately are the engine of business dynamism," says MIT economist Scott Stern, co-author of a newly published paper detailing the study's results.

"States that introduced R&D tax credits set the table for increased entrepreneurship," says Catherine Fazio MBA '14, a co-author of the study and research affiliate at the MIT Lab for Innovation Science and Policy.

Specifically, the study finds that -- other things being equal, and accounting for existing growth trends -- areas introducing R&D tax credits experience a 20 percent rise in high-quality new-firm-formation over a 10-year period, whereas areas using investment tax credits see a 12 percent drop in high-quality firm growth, also over a 10-year period.

"The investment tax credit arguably reinforces the strength of big business in these states, and that might create a barrier to entry for new firms," Stern explains. "It might harm entrepreneurship. But the R&D tax credit facilitates knowledge, facilitates science, facilitates exactly the sorts of things that can spur new ideas, and spurring new ideas is the key for our entrepreneurial ecosystem."

Indeed, adds Jorge Guzman MBA '11, PhD '17, a management professor and co-author of the study, "States offering both R&D and investment tax credits in an effort to stimulate high-growth entrepreneurship may actually be offering incentives that work at cross purposes to each other."

The paper, "The Impact of State-Level Research and Development Tax Credits on the Quality and Quantity of Entrepreneurship," appears in the latest issue of Economic Development Quarterly. Fazio is also a lecturer at Boston University's Questrom School of Business; Guzman is an assistant professor at the Columbia Business School of Columbia University; and Stern is the David Sarnoff Professor of Management of Technology at the MIT Sloan School of Management.

Third year is the take-off point

The R&D tax credit was introduced in 1981 at the federal level, with states soon adding it to their own policy toolkits. From 1981 through 2006, 32 states implemented R&D tax credits. Over the same period, 20 states granted investment tax credits. Yet no study had specifically examined the impact of state R&D tax credits on new firms.

"A classical question that had previously resisted empirical scrutiny was the impact of the state-level R&D tax credit on entrepreneurship," Stern says. Moreover, he adds, it's reasonable to question how effective the policy might be: "Growth-oriented startups don't pay a lot of taxes upfront, so it's not clear how salient taxes would be for entrepreneurship."

To conduct the study, the researchers used a unique database they have created: the Startup Cartography Project, which features about 30 years of data on business formation and startup quality -- including data showing the likelihood of success for new firms based on their key characteristics. (For instance, firms that seek intellectual property protection, or are organized to attract further equity financing, are more likely to succeed).

The scholars also used the Upjohn Panel Data on Incentives and Taxes, which contain detailed records of state tax policies, collected by Timothy Bartik, a senior economist at the W.E. Upjohn Institute for Employment Research.

By evaluating tax policy changes alongside changes in business activity, the researchers could assess the state-level effects of the R&D tax credit. Crucially, the study not only tallies firm formation, but also analyzes the quality of those startup firms and the development of local innovation ecosystems, to measure the full impact of the policy changes.

Ultimately the study examined 25 states where the two data sets overlapped thoroughly from 1990 to 2010, with the R&D tax credit available to companies in counties within these states 46 percent of the time.

By examining before-and-after data around the introduction of the state-level R&D tax credit, the researchers concluded that the policy change created more startup activity.
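
The logic of such a before-and-after comparison can be sketched as a difference-in-differences panel regression. The code below is a schematic under assumed column names, not the paper's actual specification:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Schematic difference-in-differences regression for the kind of
# before/after comparison described above -- illustrative only, not the
# paper's actual model. `df` is an assumed state-year panel with columns:
#   starts    (count of high-quality new firms)
#   rd_credit (1 if an R&D tax credit is in force that year, else 0)
#   state, year (panel identifiers)
def did_estimate(df: pd.DataFrame):
    df = df.assign(log_starts=np.log(df["starts"]))
    # State and year fixed effects absorb level differences and common
    # trends; the rd_credit coefficient is the estimated policy effect.
    fit = smf.ols("log_starts ~ rd_credit + C(state) + C(year)", data=df).fit(
        cov_type="cluster", cov_kwds={"groups": df["state"]}  # cluster by state
    )
    return fit.params["rd_credit"], fit.bse["rd_credit"]
```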

Intriguingly, the study found that the third year after the introduction of the R&D tax credit is the real take-off point for entrepreneurship in a state, with a roughly 2 percent annual growth in high-quality firm formation from that year through the 14th year after the policy change.

"It takes a few years for that impact to make its way through the system," Stern says. "If you expect a one-year payoff from this, that's too short."

To be clear, many large businesses have long featured active R&D arms, and may also benefit from the state-level R&D tax credit. Indeed, Stern says, the current study was partly motivated by policymakers' past focus on the benefits of tax credits for major corporations. Those may be real enough, but they are not the sole area of influence for the R&D tax credit.

"The policy discussion has mostly focused on lowering the burden on, and providing incentives for, investment for big business," Stern says. "Right now Amazon, for example, takes a very large R&D tax credit. And it can say, 'Do you like your Amazon Echo or Alexa and your crowdsourcing services? Well, all that came from our R&D.'" At the same time, Stern adds, "If the main policy rationale has always been to help big business, over time, [people in] public policy have discussed if it also helps startups." The study now brings data to that conversation.

The long road ahead

The current study started well before the Covid-19 crisis, which has led to a massive rise in unemployment and severe problems and uncertainty for small businesses. To be clear, Stern observes, any reasonable recovery will require policy tools that extensively reach long-existing types of firms, rather than just depending on new growth.

"In this particular economic crisis, and public health crisis, we're going to need to be restoring Main Street in a really important way," Stern says. That means helping local restaurants, retail stores, and many other traditional small businesses, Stern emphasizes. As part of his ongoing work, he is now examining new business registrations of all kinds this spring, in the midst of the pandemic.

Still, the damage from the crisis has been so vast that efforts to bounce back must take multiple tracks -- including incentives for innovative firms that might fill business needs created by the Covid-19 crisis.

"While no one can predict the future, we know that the actual economic recovery is going to depend on restoring business dynamism," Stern says. "And that means we need to start getting new entrants, and make new entrepreneurship easier and better."

States willing to give R&D tax relief to firms could well see the tactic help spur part of a larger, eventual recovery.

"The R&D tax credit is one of the few innovation policy instruments that at relatively low administrative cost, can make a big difference for spurring innovation and entrepreneurship within a region," Stern emphasizes. "You have to be committed to it. You have got to be patient. But it does pay off."

Credit: 
Massachusetts Institute of Technology

Scientists report heavy ion transfer in charged vdW cluster for the first time

image: Schematic diagram of heavy ion transfer process through tunneling.

Image: 
IMP

Since the discovery of the double helix form of DNA and the hypothesis of DNA mutation induced by proton transfer more than 50 years ago, it has been recognized that proton transfer is crucial to many chemical and biological processes.

As these processes are known to be relevant to biophysics and radiation therapy, a question arises whether a massive ion could be transferred in biochemical processes and lead to fragmentation. Specifically, in a complex bio-environment, does heavy ion transfer play a role?

Published in Nature Communications on June 12, a study reporting a new channel involving heavy N+ ion transfer observed in a charged Van der Waals cluster helps address the above questions.

The study was conducted by a team of researchers from the Institute of Modern Physics (IMP) of the Chinese Academy of Sciences (CAS), the Institute of Applied Physics and Computational Mathematics, and Centre de recherche sur les Ions, les MAtériaux et la Photonique (CIMAP) in France.

"Small van der Waals systems can be employed as experimentally feasible model systems," said Prof. ZHU Xiaolong of IMP, one of the first authors. Van der Waals (vdW) clusters are weakly bound atomic/molecular systems. "They are common in nature and important for understanding microenvironmental chemical phenomena in biosystems."

Interatomic Coulombic decay is a typical process that demonstrates energy and charge transfer over a large distance between atomic components in a cluster and results in fragmentation. It shows that channels forbidden for isolated atoms/molecules may open due to the presence of neighboring atoms. Here, the energy and charge transfers are mediated by virtual photons or the Coulomb interaction.

In hydrogen-bond clusters, the proton transfer process plays an important role as well. It involves mass and charge migration over large distances within the cluster and results in fragmentation of the latter. Nevertheless, in previous research, this kind of transfer process was limited to hydrogen-bond clusters.

In the current work, the scientists used the neutral vdW cluster N₂Ar as a target in collisions with 1 MeV Ne⁸⁺ ions to produce the doubly charged cluster (N₂Ar)²⁺.

Surprisingly, an exotic heavy-ion transfer channel, (N₂Ar)²⁺ → N⁺ + NAr⁺, was observed. It is the first time such a heavy-ion transfer process in a vdW cluster has been reported, and the consequent formation of NAr⁺ is a novel scenario.

According to the study, this channel originates from the dissociation of the parent doubly charged cluster N₂²⁺Ar, generated by the "N₂-site" two-electron loss process.

Theoretical calculations show that the polarization interactions between Ar and N₂²⁺ lead first to an isomerization of N₂²⁺Ar from its initial T-shape to a linear shape (N-N-Ar).

In addition, the neighboring neutral Ar atom decreases the N₂²⁺ barrier height and width, resulting in significantly shorter lifetimes for the metastable electronic state.

Consequently, the breakup of the covalent N⁺-N⁺ bond, the tunneling of the N⁺ ion out of the N₂²⁺ potential well, and the formation of the N-Ar⁺ bound system take place almost simultaneously. The Coulomb explosion then starts between the N⁺ and NAr⁺ ion pair.

"This new mechanism might be general for molecular dimer ions in the presence of a neighboring atom, and be of potential importance in understanding the microdynamics of biological systems," said Prof. MA Xinwen from IMP, one of the corresponding authors of this paper. "For example, it may help us understand the micromechanism of cancer therapy by heavy ion irradiation."

Credit: 
Chinese Academy of Sciences Headquarters

Health profession: Social interdependence in active learning evaluated by Delphi procedure

Physicians must be competent collaborators with team members in order to practice medicine effectively. Health professional students have limited opportunities to work and learn together during the course of their medical education. Not only is it important for students to acquire prodigious knowledge, they must also learn how to collaborate well, and the results of their efforts must be evaluated fairly to measure the effectiveness of this collaborative, active learning.

Assistant Professor Ikuo Shimizu of Shinshu University School of Medicine and collaborators used a modified Delphi procedure to establish the content validity of a scale measuring students' social interdependence in collaborative learning. Teamwork and collaboration are common goals throughout higher education and the workplace, but they are not always properly evaluated. It is crucial to come up with a fair assessment that can be used to further improve techniques and procedures.

Although it was difficult to recruit a diverse panel from abroad, Assistant Professor Shimizu was able to form a panel of medical students, education experts and medical educators from eight countries: Australia, the Czech Republic, Japan, Malaysia, the Netherlands, Singapore, Thailand and the United States. The medical educators all had experience with collaborative learning, such as problem-based learning and team-based learning, in health profession education curricula.

Social interdependence theory, widely applied in educational psychology, holds that individuals' outcomes are affected by their own and others' actions; this interdependence can be positive or negative, and the transformation from self-interest to mutual interest acts as an incentive for or against collaboration. When a common goal can only be achieved by the input of all members, and not of just one player, positive social interdependence elicits collaborative effort from everyone.

The instrument was developed by systematically building consensus among the experts, with multiple feedback rounds to support effective implementation. After the development phase, the instrument was validated through evaluation questionnaires. The group succeeded in developing a scale for measuring social interdependence in collaborative learning that incorporates the opinions of international stakeholders.

Assistant Professor Shimizu said: "In medical education, it is important to develop mutual, reciprocal interdependence to aim for better practice while bringing out the best in each other's skills. I hope others will utilize this evaluation method in multi-disciplinary cooperative education. By using this evaluation method, active learning practices can hopefully be further improved."

Please read "Measuring social interdependence in collaborative learning: instrument development and validation" for more information.

Credit: 
Shinshu University

Minimizing thermal conductivity of crystalline material with optimal nanostructure

image: The optimum nanostructure designed with MI (an aperiodic superlattice structure) was fabricated, and its optimal performance was verified by measuring its thermal conductivity. In the figure, "Actual Structure" is an electron microscope image of the fabricated sample. In addition, further analysis of phonon transport in the optimal structure clarified the mechanism that reduces thermal conductivity.

Image: 
The University of Tokyo

Professor Junichiro Shiomi and colleagues at The University of Tokyo set out to reduce the thermal conductivity of semiconductor materials by tailoring their internal nanostructure. They successfully minimized thermal conductivity by designing, fabricating, and evaluating optimal multilayer nanostructures through materials informatics (MI), which combines machine learning and molecular simulation. In 2017, the group developed a computational MI method to design structures that minimize or maximize thermal conductivity. Until now, however, the approach had not been demonstrated experimentally: the nano-scale structures still had to be fabricated, and the optimal structure confirmed through property measurements.
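
The broad shape of such an MI design loop can be sketched as follows. This is a conceptual illustration only: simulate_kappa stands for an assumed molecular-simulation scoring function, and a random proposal step stands in for the Bayesian-optimization surrogate typically used in materials informatics:

```python
import random

# Conceptual sketch of an MI design loop -- illustrative only, not the
# group's actual pipeline. `simulate_kappa` is an assumed callable that
# returns the simulated thermal conductivity of a candidate 0/1 layer
# sequence (which material occupies each layer of the superlattice).
def mi_optimize(simulate_kappa, n_layers=16, budget=100):
    evaluated = {}
    while len(evaluated) < budget:
        # Placeholder acquisition step: a random proposal stands in for
        # the surrogate-model-guided proposal of Bayesian optimization.
        cand = tuple(random.randint(0, 1) for _ in range(n_layers))
        if cand in evaluated:
            continue  # skip already-scored structures
        evaluated[cand] = simulate_kappa(cand)  # expensive phonon simulation
    best = min(evaluated, key=evaluated.get)    # lowest thermal conductivity
    return best, evaluated[best]
```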

Thus, the research group combined a film deposition method that can control, at the molecular level, a superlattice structure in which two materials are alternately layered at a thickness of several nanometers, with a measurement method that can assess the thermal conductivity of a film at the nanoscale. They thereby realized the optimal aperiodic superlattice structure that minimizes thermal conductivity. In the optimal structure, wave interference of the lattice vibrations (phonons) that conduct heat is maximized, strongly suppressing thermal conductivity.

In the present study, using the semiconductor lattice structure as the model, the research group verified the utility of the MI method in the design, fabrication, assessment, and mechanism elucidation of structures that regulate thermal conductivity. In the future, application of the MI method to various material systems is anticipated. The work also showed that optimizing an aperiodic structure can regulate thermal conductivity by fully controlling the wave properties of phonons near room temperature. This is expected to contribute to developments in phonon engineering, for instance in thermoelectric conversion devices, optical sensors, and gas sensors, where low thermal conductivity is needed while maintaining electrical conductivity and mechanical properties.

Credit: 
Japan Science and Technology Agency

Artificial intelligence makes blurry faces look more than 60 times sharper

image: The system automatically increases any image's resolution up to 64x, 'imagining' features such as pores and eyelashes that weren't there in the first place.

Image: 
Rudin lab

DURHAM, N.C. -- Duke University researchers have developed an AI tool that can turn blurry, unrecognizable pictures of people's faces into eerily convincing computer-generated portraits, in finer detail than ever before.

Previous methods can scale an image of a face up to eight times its original resolution. But the Duke team has come up with a way to take a handful of pixels and create realistic-looking faces with up to 64 times the resolution, 'imagining' features such as fine lines, eyelashes and stubble that weren't there in the first place.

"Never have super-resolution images been created at this resolution before with this much detail," said Duke computer scientist Cynthia Rudin, who led the team.

The system cannot be used to identify people, the researchers say: It won't turn an out-of-focus, unrecognizable photo from a security camera into a crystal clear image of a real person. Rather, it is capable of generating new faces that don't exist, but look plausibly real.

While the researchers focused on faces as a proof of concept, the same technique could in theory take low-res shots of almost anything and create sharp, realistic-looking pictures, with applications ranging from medicine and microscopy to astronomy and satellite imagery, said co-author Sachit Menon '20, who just graduated from Duke with a double-major in mathematics and computer science.

The researchers will present their method, called PULSE, next week at the 2020 Conference on Computer Vision and Pattern Recognition (CVPR), held virtually from June 14 to June 19.

Traditional approaches take a low-resolution image and 'guess' what extra pixels are needed by trying to get them to match, on average, with corresponding pixels in high-resolution images the computer has seen before. As a result of this averaging, textured areas in hair and skin that might not line up perfectly from one pixel to the next end up looking fuzzy and indistinct.

The Duke team came up with a different approach. Instead of taking a low-resolution image and slowly adding new detail, the system scours AI-generated examples of high-resolution faces, searching for ones that look as much as possible like the input image when shrunk down to the same size.

The team used a machine learning tool called a "generative adversarial network," or GAN, which pairs two neural networks trained on the same data set of photos. One network generates AI-created human faces that mimic the ones it was trained on, while the other takes this output and decides whether it is convincing enough to be mistaken for the real thing. The first network gets better and better with experience, until the second network can't tell the difference.
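
In outline, that search amounts to optimizing a latent vector against a pretrained generator. The sketch below is illustrative rather than the released PULSE code; it assumes a pretrained generator G and a low-resolution target tensor lr:

```python
import torch
import torch.nn.functional as F

# Minimal sketch of a PULSE-style latent-space search -- illustrative,
# not the released PULSE code. `G` is an assumed pretrained GAN generator
# mapping a latent vector to a high-resolution face tensor, and `lr` is
# the low-resolution target (e.g., a 1x3x16x16 tensor).
def latent_search(G, lr, steps=500, step_size=0.1, latent_dim=512):
    z = torch.randn(1, latent_dim, requires_grad=True)   # random starting latent
    opt = torch.optim.Adam([z], lr=step_size)
    for _ in range(steps):
        opt.zero_grad()
        hr = G(z)                                        # candidate high-res face
        down = F.interpolate(hr, size=lr.shape[-2:],     # shrink to input size
                             mode="bicubic", align_corners=False)
        loss = F.mse_loss(down, lr)                      # match the blurry input
        loss.backward()                                  # gradient w.r.t. z only
        opt.step()
    return G(z).detach()                                 # plausible high-res output
```

Different random starting latents converge to different plausible faces, which is why the method can "spit out any number of uncannily lifelike possibilities" from a single blurred input.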

PULSE can create realistic-looking images from noisy, poor-quality input that other methods can't, Rudin said. From a single blurred image of a face it can spit out any number of uncannily lifelike possibilities, each of which looks subtly like a different person.

Even given pixelated photos where the eyes and mouth are barely recognizable, "our algorithm still manages to do something with it, which is something that traditional approaches can't do," said co-author Alex Damian '20, a Duke math major.

The system can convert a 16x16-pixel image of a face into a 1024x1024-pixel image in a few seconds, adding more than a million pixels, akin to HD resolution. Details such as pores, wrinkles, and wisps of hair that are imperceptible in the low-res photos become crisp and clear in the computer-generated versions.

The researchers asked 40 people to rate 1,440 images generated via PULSE and five other scaling methods on a scale of one to five, and PULSE did the best, scoring almost as high as high-quality photos of actual people.

See the results and upload images for yourself at http://pulse.cs.duke.edu/.

Credit: 
Duke University

Together they stay alive longer

image: Scanning electron microscope (SEM) image of an aerosol particle formed from mycobacterial aggregates.

Image: 
Elisabeth Pfrommer, Heinrich-Pette-Institute / FZ Borstel

Hamburg/Borstel/Leipzig. The tuberculosis pathogen Mycobacterium tuberculosis can protect itself better when its cells clump together, and can thus stay alive longer in the air. This was the result of a study by the Leibniz Research Alliance INFECTIONS, published in the journal Scientific Reports on Monday.

The study examined the biophysical properties of the tiny airborne particles (aerosols) that contribute to the spread of the pathogen. Successful human-to-human infection is determined, among other things, by the distance the pathogen can travel through the air before its infectivity decreases. The conclusion: although individual mycobacteria form smaller aerosols and can thus travel longer distances in the air, interconnected mycobacteria stay alive longer.

The study builds on earlier results showing that mycobacteria-infected host cells die by necrosis, as occurs in the lungs of tuberculosis patients. It has now been shown that larger aerosol particles, formed from mycobacterial clusters together with components of the dead cells, remain viable in the air longer than individual bacteria. Based on these data, computer simulations of airborne dispersal that take the particle size distribution into account can be carried out in the future, helping to determine which aerosol compositions may pose an increased risk of infection for humans.

The study was carried out at the Research Center Borstel, Leibniz Lung Center (FZB) in Schleswig-Holstein and the Heinrich Pette Institute (HPI), Leibniz Institute for Experimental Virology in Hamburg. The Leibniz Institute for Tropospheric Research (TROPOS) contributed its expertise in modelling the dispersion of aerosols such as mycobacterial associations floating in the air to the study.

Currently there is a controversial discussion about the importance of the aerosol dispersion of the SARS-CoV-2 virus for the COVID-19 pandemic. Findings on the aerosol spread of pathogens are therefore of particular interest. Whether parts of the new findings on the tuberculosis pathogen can be transferred to the COVID-19 pathogen is, however, currently completely open, since tuberculosis is transmitted by a bacterium that is significantly larger than the SARS-CoV-2 virus. Viruses are considered to be much more sensitive to environmental influences, as they depend on protection by moisture and dry out relatively quickly.

Credit: 
Leibniz Institute for Tropospheric Research (TROPOS)

COVID-19: Relationship between social media use and prejudice against Chinese Americans

The novel coronavirus SARS-CoV-2 that originated in China has claimed an estimated 100,000 lives in the United States, while a different sort of pandemic is spreading online against Asian Americans, particularly of Chinese descent. A study published in Frontiers in Communication suggests there is a strong relationship between social media use and prejudice.

The authors surveyed nearly 300 people in the United States on their attitudes about China and Chinese people in the wake of the pandemic. They found that "the more an individual believes their most used daily social media is fair, accurate, presents the facts, and is concerned about the public (social media belief), the more that person sees Chinese Americans as a realistic and symbolic threat."

Lead author Dr. Stephen Croucher, a professor of communication at Massey University in New Zealand whose research focuses on the dynamics between majority and minority groups, states: "This was a big finding for us, as it shows the relationship between a pandemic, social media use and prejudice."

The online questionnaire of 277 white Americans gathered data on demographics, social media use, and various sentiments about Chinese people. The researchers analyzed the results within the framework of Integrated Threat Theory (ITT). ITT examines the components - realistic threats, symbolic threats, intergroup anxiety and negative stereotypes - that lead to prejudice between social groups.

Realistic threats, for example, represent fears related to economic or social power. A sample question on the survey assessing the degree of realistic threat included, ''Because of the presence of Chinese, unemployment will increase.'' Respondents then answered on a scale of one to five, from "strongly disagree" to "strongly agree."

Symbolic threats, on the other hand, relate to concerns about a group's "way of life." Intergroup anxiety refers to negative perceptions that arise from individual interactions between a member of the majority and a minority.

One key finding was that gender plays a significant role in predicting realistic and symbolic threats versus intergroup anxiety among Americans. Women tend to experience realistic or symbolic threats from Chinese Americans, while men experience higher levels of anxiety, according to the study.

"In this case, when faced with a crisis like a pandemic, it just makes sense that men would tend to respond more affectively while women would respond more cognitively - on average," Croucher said.

One head-scratching result from the study: respondents who identified politically as Democrats scored higher than Republicans on perceiving Chinese Americans as a symbolic threat.

"The result about political lines really was a surprising result," Croucher said, adding that it would be "really interesting" to further research how political leanings shift when a group is perceived as life threatening.

More than 1,700 incidents of harassment and assaults against Asian Americans have been reported since March 19, according to a website maintained by Asian Pacific Policy and Planning Council, San Francisco State University and Chinese for Affirmative Action.

Until the COVID-19 pandemic, anti-Asian hate crime had been on the decline for at least two decades, according to a report in The Washington Post, and the FBI had not reported any anti-Asian-motivated murders since at least 2003.

Croucher said that social media channels, like any media, can also be used effectively for spreading positive messages about Asian Americans. He and his co-authors proposed governments and healthcare industries use social media to combat COVID-19 prejudice.

"In the case of COVID-19, social media, and other media, were and are being used as venues to share and build ideas, values and morals," Croucher said. "Many of these are very positive, but some are not."

Credit: 
Frontiers

Clues to ageing come to light in vivid snapshots of brain cell links

image: Young mouse brain section showing lower synapse diversity.

Image: 
Zhen Qiu and Seth Grant, University of Edinburgh.

The colourful pictures of the whole mouse brain at different ages are the first of their kind and a pivotal step forward in understanding behaviour, scientists say.

Findings - published in the journal Science - could shed light on learning disability and dementia and help to reveal how memories are affected by age.

The images are of synapses - vital connections that carry electrical and chemical messages between brain cells. Synapses store memories and synapse damage is linked to more than 130 brain diseases.

Researchers based at the University of Edinburgh colour-coded the different types of molecules to highlight the range of synapses in mouse brains from birth to old age.

They discovered that the number and molecular makeup of synapses shift with age in different parts of the brain. This happens in three main phases: childhood, middle age and old age.

Synapse type shifts with age in patterns unique to areas of the brain, blossoming into a diverse array in midlife.

Images from middle-aged brains burst with colour, illustrating a wide variety of synapses. Both very young and very old brains show fewer synapses and less complexity.

Researchers say these changes give insights into why genes cause synapse damage at set ages and in set brain areas.

The findings could shed light on why we are more likely to develop brain conditions at certain ages, helping to explain why schizophrenia often starts in adolescence, or why dementia affects older adults.

The study was funded by Wellcome and the European Research Council.

Lead researcher, Professor Seth Grant of the Centre for Clinical Brain Sciences at the University of Edinburgh, said: "The brain is the most complex thing we know of and understanding it at this level of detail is a momentous step forward.

"We believe that these findings will be instrumental to helping understand why the brain is susceptible to disease at different times of life and how the brain changes as we age."

Credit: 
University of Edinburgh

How dashcams help and hinder forensics

image: Dashcams are vital for helping police investigate car incidents, however the way the footage is submitted to police, managed and processed can cause problems. A researcher at WMG, University of Warwick has assessed seven different types of dashcams' SD storage systems to see how they help and hinder digital forensics.

Image: 
WMG, University of Warwick

Dashcams are an important in-car accessory that record car journeys, the footage from them is important for evidence if an incident has occurred.

However, how the dashcam footage is submitted, managed and processed can be a problem for the forensics teams investigating a case.

The recording mode, GPS data, speed, license plate and temporal data of seven different devices have been assessed by researchers from WMG, University of Warwick.


Many cars now have dashcams, an in-vehicle mountable camera which records video and audio footage of journeys. They have significant evidential value in digital forensics as they provide GPS data, temporal data, vehicular speed data, audio, video and photographic images.

In the paper 'Dashcam forensics: A preliminary analysis of 7 dashcam devices', published in the journal Forensic Science International: Digital Investigation, Dr Harjinder Lallie of WMG, University of Warwick explores two aspects of dashcam evidence: the problems related to the management and processing of dashcam evidence, and an analysis of the artefacts generated by dashcams.

The first dedicated UK dashcam evidence submission portal, Nextbase, was established in 2018. Currently, five police forces use Nextbase, while fourteen accept dashcam footage through their own police sites; seventeen more intend to activate online acceptance, and seven do not accept online submissions.

Seven different dashcams' SD card systems were analysed for their:

Recording mode

GPS data

Vehicular speed data

License plate data

Temporal data

It was found that all of the artefacts above are available in several different locations: NMEA files, configuration files, directory naming structures, EXIF metadata, filename structures, file system attributes and watermarks.

A number of tools were required to extract the needed artefacts from the different locations on the SD card and to analyse them. It was also found that evidential artefacts can be synthesised using tools such as native video players; therefore, better methods are required for extracting and synthesising metadata from dashcams.
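
As an illustration of the kind of artefact extraction involved, the sketch below parses a standard $GPRMC sentence from an NMEA log, one of the SD-card artefact locations listed above. It is a generic example, not tied to any specific device from the study:

```python
# Illustrative sketch of recovering GPS and speed artefacts from a
# dashcam's NMEA log. The field layout follows the standard $GPRMC
# sentence; real devices may embed or vary this format.
def parse_gprmc(sentence):
    f = sentence.strip().split(",")
    if not f[0].endswith("GPRMC") or f[2] != "A":  # 'A' marks a valid fix
        return None

    def to_deg(val, hemi):  # NMEA ddmm.mmmm -> signed decimal degrees
        dot = val.index(".")
        degrees = float(val[: dot - 2]) + float(val[dot - 2:]) / 60.0
        return -degrees if hemi in ("S", "W") else degrees

    return {
        "time_utc": f[1],                  # hhmmss(.sss)
        "lat": to_deg(f[3], f[4]),
        "lon": to_deg(f[5], f[6]),
        "speed_kmh": float(f[7]) * 1.852,  # knots -> km/h
        "date": f[9],                      # ddmmyy
    }

# Example:
# parse_gprmc("$GPRMC,123519,A,4807.038,N,01131.000,E,022.4,084.4,230394,003.1,W*6A")
# -> lat ~48.117, lon ~11.517, speed ~41.5 km/h
```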

Dr Harjinder Lallie, from WMG, University of Warwick explains:

"We are increasingly reliant on the evidence produced by dashcam devices. However, there exist no standard guidelines on how to investigate dashcams and this can have an impact on judiciary process and the outcome thereof. This research is the first step towards developing such guidance."

Future research will look at

a. Formulating dashcam investigation guidelines for law enforcement.

b. Methods of automating the extraction of geospatial data and internally corroborating them.

c. Automating the extraction of evidence presented in watermarks in dashcam recorded videos.

Credit: 
University of Warwick

New distance measurements bolster challenge to basic model of universe

A new set of precision distance measurements made with an international collection of radio telescopes has greatly increased the likelihood that theorists need to revise the "standard model" that describes the fundamental nature of the Universe.

The new distance measurements allowed astronomers to refine their calculation of the Hubble Constant, the expansion rate of the Universe, a value important for testing the theoretical model describing the composition and evolution of the Universe. The problem is that the new measurements exacerbate a discrepancy between previously measured values of the Hubble Constant and the value predicted by the model when applied to measurements of the cosmic microwave background made by the Planck satellite.

"We find that galaxies are nearer than predicted by the standard model of cosmology, corroborating a problem identified in other types of distance measurements. There has been debate over whether this problem lies in the model itself or in the measurements used to test it. Our work uses a distance measurement technique completely independent of all others, and we reinforce the disparity between measured and predicted values. It is likely that the basic cosmological model involved in the predictions is the problem," said James Braatz, of the National Radio Astronomy Observatory (NRAO).

Braatz leads the Megamaser Cosmology Project, an international effort to measure the Hubble Constant by finding galaxies with specific properties that lend themselves to yielding precise geometric distances. The project has used the National Science Foundation's Very Long Baseline Array (VLBA), Karl G. Jansky Very Large Array (VLA), and Robert C. Byrd Green Bank Telescope (GBT), along with the Effelsberg telescope in Germany. The team reported their latest results in the Astrophysical Journal Letters.

Edwin Hubble, after whom the orbiting Hubble Space Telescope is named, first calculated the expansion rate of the universe (the Hubble Constant) in 1929 by measuring the distances to galaxies and their recession speeds. The more distant a galaxy is, the greater its recession speed from Earth. Today, the Hubble Constant remains a fundamental property of observational cosmology and a focus of many modern studies.

Measuring recession speeds of galaxies is relatively straightforward. Determining cosmic distances, however, has been a difficult task for astronomers. For objects in our own Milky Way Galaxy, astronomers can get distances by measuring the apparent shift in the object's position when viewed from opposite sides of Earth's orbit around the Sun, an effect called parallax. The first such measurement of a star's parallax distance came in 1838.

Beyond our own Galaxy, parallaxes are too small to measure, so astronomers have relied on objects called "standard candles," so named because their intrinsic brightness is presumed to be known. The distance to an object of known brightness can be calculated based on how dim the object appears from Earth. These standard candles include a class of stars called Cepheid variables and a specific type of stellar explosion called a Type Ia supernova.

Another method of estimating the expansion rate involves observing distant quasars whose light is bent by the gravitational effect of a foreground galaxy into multiple images. When the quasar varies in brightness, the change appears in the different images at different times. Measuring this time difference, along with calculations of the geometry of the light-bending, yields an estimate of the expansion rate.

Determinations of the Hubble Constant based on the standard candles and the gravitationally lensed quasars have produced values of 73-74 kilometers per second per megaparsec; in other words, for every megaparsec (about 3.26 million light-years) of distance, a galaxy's recession speed increases by 73-74 kilometers per second.

However, predictions of the Hubble Constant from the standard cosmological model when applied to measurements of the cosmic microwave background (CMB) -- the leftover radiation from the Big Bang -- produce a value of 67.4 kilometers per second per megaparsec, a significant and troubling difference. This difference, which astronomers say is beyond the experimental errors in the observations, has serious implications for the standard model.

The model is called Lambda Cold Dark Matter, or Lambda CDM, where "Lambda" refers to Einstein's cosmological constant and is a representation of dark energy. The model divides the composition of the Universe mainly between ordinary matter, dark matter, and dark energy, and describes how the Universe has evolved since the Big Bang.

The Megamaser Cosmology Project focuses on galaxies with disks of water-bearing molecular gas orbiting supermassive black holes at the galaxies' centers. If the orbiting disk is seen nearly edge-on from Earth, bright spots of radio emission, called masers -- radio analogs to visible-light lasers -- can be used to determine both the physical size of the disk and its angular extent, and therefore, through geometry, its distance. The project's team uses the worldwide collection of radio telescopes to make the precision measurements required for this technique.
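A minimal sketch of that geometry follows, with made-up but plausible numbers: Doppler shifts give the masers' orbital speed, the slow drift of the maser line velocities gives their acceleration, and for a circular orbit the physical radius follows from a = v^2 / R. Comparing that radius with the measured angular radius then yields the distance.

```python
import math

SECONDS_PER_YEAR = 3.156e7
KM_PER_MPC = 3.086e19
RAD_PER_MAS = math.pi / (180 * 3600 * 1000)  # radians per milliarcsecond

v = 800.0                   # maser orbital speed from Doppler shifts, km/s
a = 2.6 / SECONDS_PER_YEAR  # centripetal acceleration: 2.6 km/s/yr -> km/s^2
theta_mas = 1.0             # measured angular radius of the maser ring, mas

R_km = v**2 / a             # physical orbital radius, from a = v^2 / R
D_mpc = (R_km / (theta_mas * RAD_PER_MAS)) / KM_PER_MPC
print(f"Distance ~ {D_mpc:.0f} Mpc")  # ~52 Mpc with these inputs
```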

In their latest work, the team refined their distance measurements to four galaxies, at distances ranging from 168 million light-years to 431 million light-years. Combined with previous distance measurements of two other galaxies, their calculations produced a value for the Hubble Constant of 73.9 kilometers per second per megaparsec.
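As a consistency check on those figures, converting the article's nearest and farthest distances from light-years to megaparsecs and applying the team's 73.9 value gives the implied recession speeds; the speeds below are derived for illustration, not quoted from the paper.

```python
MLY_PER_MPC = 3.262  # one megaparsec is about 3.262 million light-years
H0 = 73.9            # km/s per Mpc, the team's latest result

for d_mly in (168, 431):  # nearest and farthest of the four galaxies
    d_mpc = d_mly / MLY_PER_MPC
    print(f"{d_mly} Mly = {d_mpc:6.1f} Mpc -> v ~ {H0 * d_mpc:6.0f} km/s")
```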

"Testing the standard model of cosmology is a really challenging problem that requires the best-ever measurements of the Hubble Constant. The discrepancy between the predicted and measured values of the Hubble Constant points to one of the most fundamental problems in all of physics, so we would like to have multiple, independent measurements that corroborate the problem and test the model. Our method is geometric, and completely independent of all others, and it reinforces the discrepancy," said Dom Pesce, a researcher at the Center for Astrophysics | Harvard and Smithsonian, and lead author on the latest paper.

"The maser method of measuring the expansion rate of the universe is elegant, and, unlike the others, based on geometry. By measuring extremely precise positions and dynamics of maser spots in the accretion disk surrounding a distant black hole, we can determine the distance to the host galaxies and then the expansion rate. Our result from this unique technique strengthens the case for a key problem in observational cosmology." said Mark Reid of the Center for Astrophysics | Harvard and Smithsonian, and a member of the Megamaser Cosmology Project team.

"Our measurement of the Hubble Constant is very close to other recent measurements, and statistically very different from the predictions based on the CMB and the standard cosmological model. All indications are that the standard model needs revision," said Braatz.

Astronomers have various ways to adjust the model to resolve the discrepancy. Some involve changing assumptions about the nature of dark energy, moving away from Einstein's cosmological constant. Others look to fundamental changes in particle physics, such as altering the number or types of neutrinos, or allowing for interactions among them. There are other, even more exotic possibilities, and at the moment scientists have no clear evidence for discriminating among them.

"This is a classic case of the interplay between observation and theory. The Lambda CDM model has worked quite well for years, but now observations clearly are pointing to a problem that needs to be solved, and it appears the problem lies with the model," Pesce said.

Credit: 
National Radio Astronomy Observatory

COVID-19 triage decisions should 'ignore life-years saved,' writes bioethicist in Medical Care

June 11, 2020 - How do we decide which patients with COVID-19 should get priority for lifesaving ventilators and ICU beds? Writing in the July issue of Medical Care, a prominent bioethicist argues that COVID-19 triage strategies should focus on saving lives, rather than prioritizing life-years saved. Medical Care is published in the Lippincott portfolio by Wolters Kluwer.

"Justice supports triage priority for those with better initial survival prognosis, but opposes considering subsequent life-years saved," according to a special editorial by John R. Stone, MD, PhD, Professor of Bioethics and Co-Founder and Co-Executive Director of the Center for Promoting Health and Health Equity at Creighton University, Omaha. He adds: "Groups experiencing historical and current inequities must have significant voices in determining triage policy."

'Justice-Respect-Worth' Framework Calls for Rethinking COVID-19 Triage

Recent articles have proposed frameworks for making the "terrible choices" posed by COVID-19, frameworks focused on maximizing the benefits of treatment as measured in life-years saved. In one approach, patients with lower "prognosis scores" get lower priority for critical care.

But the focus on counting life-years violates "the foundational moral framework of social justice, respect for persons, and people's equal and substantial moral worth," Dr. Stone writes. In particular, prioritizing treatment for patients with a better prognosis will give lower priority, on average, "to individuals for whom social/structural inequities are significant causes of worse health" - with racial/ethnic minorities being a key example.

"Historical and present inequities have reduced expected life-years in populations experiencing chronic disadvantage," according to the author. "Justice requires avoiding policies that further increase inequities...greater priority for more predicted life-years saved will exacerbate those inequities."

A more just approach would be to consider the individual's likelihood of initial survival, while ignoring subsequent life-years saved. "Triage policies can reasonably give priority to people more likely to survive hospitalization and a brief time after," Dr. Stone writes.

By this approach, a younger and an older patient would have similar priority for lifesaving care, as long as they had a similar chance of surviving for more than a few months after leaving the hospital. (Dr. Stone adds that bias against the elderly is another reason not to prioritize life-years gained.)

While guidance for triage decisions tries to ensure objectivity, assessments may still be affected by implicit and unconscious negative bias. For that reason, diverse representation on triage teams is essential. Policy decision-makers must include representatives of "populations historically oppressed and disadvantaged," according to the author.

Dr. Stone highlights the importance of the "justice-respect-worth" framework in making difficult decisions about which patients should be prioritized for scarce healthcare resources. He concludes: "Triage policies focused on life-years saved will perpetuate social injustice and generally should be rejected."

Credit: 
Wolters Kluwer Health