Tech

Researchers question fundamental study on the Kondo effect

image: Illustration showing the atomic tip of a scanning tunnelling microscope probing a metal surface on which a cobalt atom is positioned. A characteristic dip in the measurement results is found on surfaces made of copper as well as of silver and gold.

Image: 
Forschungszentrum Jülich

Jülich, Germany, 7 January 2021. The Kondo effect influences the electrical resistance of metals at low temperatures and generates complex electronic and magnetic orders. Novel concepts for data storage and processing, such as those using quantum dots, are based on it. In 1998, researchers from the United States published spectroscopic studies of the Kondo effect using scanning tunnelling microscopy, which are considered ground-breaking and have triggered countless similar studies. Many of these studies may have to be re-examined now that Jülich researchers have shown that the Kondo effect cannot be proven beyond doubt by this method. Instead, another phenomenon creates precisely the spectroscopic "fingerprint" that was previously attributed to the Kondo effect.

Normally, the resistance of metals decreases as the temperature drops. The Kondo effect causes it to rise again below a threshold value typical of the material in question, the so-called Kondo temperature. This phenomenon occurs when magnetic foreign atoms, such as iron, contaminate non-magnetic host metals, such as copper. Simply put, when a current flows, electrons stream around the atomic nuclei. The iron atoms have a quantum mechanical magnetic moment. At low temperatures, this causes nearby electrons to align their spins antiparallel to the moment of the atom and to hang around the iron atom like a cloud on a mountaintop. This hinders the flow of electrons - the electrical resistance then increases. In physics, this strong coupling of the impurity's moment with the spins of the surrounding electrons is known as entanglement. The effect can be exploited, for example, in the form of quantum dots: nanocrystals that could one day serve as minuscule information storage or processor elements.
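
To illustrate that resistance minimum (this toy model is our addition, not part of the Jülich study), the low-temperature resistivity of a dilute magnetic alloy is often written as a residual term plus a phonon term plus a logarithmic Kondo term; the competition between the rising phonon contribution and the falling logarithmic contribution produces a minimum near the Kondo temperature. A minimal sketch in Python, with arbitrary illustrative coefficients:

```python
import numpy as np

# Toy model of resistivity in a dilute magnetic alloy:
#   rho(T) = rho_0 + a*T^5 + c*ln(T_K / T)
# rho_0: residual resistivity; a*T^5: phonon scattering;
# c*ln(T_K/T): Kondo contribution from magnetic impurities.
# All coefficients are arbitrary, chosen only to make the minimum visible.
rho_0, a, c, T_K = 1.0, 1e-9, 0.05, 30.0  # arbitrary units

T = np.linspace(1.0, 40.0, 400)           # temperature (K)
rho = rho_0 + a * T**5 + c * np.log(T_K / T)

T_min = T[np.argmin(rho)]
print(f"Resistance minimum near T = {T_min:.1f} K for these toy parameters")
```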

The Kondo effect was first observed in 1934 and was fundamentally explained by Jun Kondo in 1964. In 1998, experimental physicists achieved a methodological breakthrough in the study of the effect. By means of scanning tunnelling microscopy, it had become possible to detect and position individual atoms on surfaces and to record energy spectra specifically at these points. A characteristic dip in the measurement curve was found at the position of cobalt atoms on a gold surface, which from then on was considered the marker of the Kondo effect. Previously, the Kondo effect could only be detected indirectly via resistance measurements. Further investigations of other material combinations and atomic arrangements using this technique followed, and a separate field of research emerged, dedicated to the investigation of many-body phenomena with atomic resolution.

However, the physicists from the Peter Grünberg Institute and the Institute for Advanced Simulation at Forschungszentrum Jülich have now found an alternative cause for the dip in the energy spectrum: so-called magnetic anisotropy. Below a specific temperature, this causes the magnetic moment of the foreign atom to couple to the crystal lattice of the host metal, so that the orientation of the moment virtually "freezes". Above this temperature, excitations of the magnetic moment occur due to the spin properties of the tunnelling electrons of the microscope. Scientists were not yet able to measure this type of spin excitation in 1998.

The researchers have been working for years to improve theoretical models of spin excitation. Early on, they found evidence of the Kondo-like marker. Initially, however, they lacked the ability to consistently include important, so-called relativistic effects in their calculations. Once they had succeeded in doing so, they took another look at the system of cobalt and gold. They were now able to back up their calculations impressively with data from scanning tunnelling spectroscopy studies: the measured and calculated spectra agree closely.

"This means that much of what we thought we had learned about the Kondo effect over the last two decades, and which has already found its way into textbooks, needs to be re-examined," explains Prof. Samir Lounis, head of the Functional Nanoscale Structure Probe and Simulation Laboratory (Funsilab). The scientists are already proposing the first new experiments based on their predictions.

Credit: 
Forschungszentrum Juelich

School nutrition professionals' employee safety experiences during onset of the COVID-19 pandemic

audio: Emily Vaterlaus Patten, PhD, RDN, CD, talks about a new study that explores real-time personal and employee safety experiences and perspectives of school nutrition professionals ranging from frontline staff to state leadership across the US during the early weeks of the coronavirus pandemic. “They’re veritable heroes. They’ve worked and self-sacrificed to mitigate food insecurity in their communities during this difficult time.”

Image: 
Journal of Nutrition Education and Behavior

Philadelphia, January 7, 2021 - A new study in the Journal of Nutrition Education and Behavior, published by Elsevier, explores real-time personal and employee safety experiences and perspectives of school nutrition professionals ranging from frontline staff to state leadership across the United States during the early weeks of the coronavirus pandemic.

A survey with both closed- and open-ended items was developed to explore the experiences of school nutrition staff, managers, directors, and state agency personnel. Descriptive statistics from the responses of 47 states were calculated, and a thematic analysis of an open-ended item was conducted.

"When we analyzed the survey's open-ended results, we found three themes in the responses. The most prominent theme, that 94 percent of our respondents reported, was that they had significant concern for risk of transmission or risk of exposure to the coronavirus while they were at work as they interacted with people in the community and their coworkers," said lead investigator Emily Vaterlaus Patten, PhD, RDN, CD, Brigham Young University, Provo, UT, USA.

Another theme that emerged was concern with processes, including the administrative interactions, logistics, and protocols for serving emergency meals. For some, there was a feeling of distrust or a lack of support from authorities. School districts and state health departments were described as disconnected from the situation, apathetic, or uncaring. "I don't trust that a school nurse understands and/or even cares about whether or not we are safe. They are working from home and have no clue what we are up against," wrote a nutrition program director from Texas.

Personal concerns were also a common theme in the research results. Respondents described feelings of exhaustion, fear, and stress. Personal concerns were often rooted in the perspective that working through the pandemic was a high risk, low-reward situation. Others were concerned for themselves, their families, peers, or employees because of the risk factors associated with severe illness.

"This was and continues to be a challenging time for school nutrition professionals and this research highlights that they have demonstrated resilience through the whole process. USDA Waivers have helped improve conditions, and nutrition programs have innovated their distribution methods. I think we are celebrating the enormous number of meals that have been served during this time and the people that made it happen. They're veritable heroes. They've worked and self-sacrificed to mitigate food insecurity in their communities during this difficult time," Dr. Vaterlaus Patten said.

Credit: 
Elsevier

Chemists invent shape-shifting nanomaterial with biomedical potential

image: Fluorescent micrograph shows the new nanomaterial in sheet form. The white scale bar is 4 micrometers in the main photo and 2 micrometers in the inset photo.

Image: 
Conticello Lab

Chemists have developed a nanomaterial that they can trigger to shape shift -- from flat sheets to tubes and back to sheets again -- in a controllable fashion. The Journal of the American Chemical Society published a description of the nanomaterial, which was developed at Emory University and holds potential for a range of biomedical applications, from controlled-release drug delivery to tissue engineering.

The nanomaterial, which in sheet form is 10,000 times thinner than the width of a human hair, is made of synthetic collagen. Naturally occurring collagen is the most abundant protein in humans, making the new material intrinsically biocompatible.

"No one has previously made collagen with the shape-shifting properties of our nanomaterial," says Vincent Conticello, senior author of the finding and Emory professor of biomolecular chemistry. "We can convert it from sheets to tubes and back simply by varying the pH, or acid concentration, in its environment."

The Emory Office of Technology Transfer has applied for a provisional patent for the nanomaterial.

First authors of the finding are Andrea Merg, a former post-doctoral fellow in the Conticello lab who is now at the University of California Merced, and Gavin Touponse, who did the work as an Emory undergraduate and is now in medical school at Stanford. The work was a collaboration between Emory and scientists from the Argonne National Laboratory, the Paul Scherrer Institute in Villigen, Switzerland, and the Center for Cellular Imaging and NanoAnalytics at the University of Basel.

Collagen is the main structural protein in the body's connective tissue, such as cartilage, bones, tendons, ligaments and skin. It is also abundant in blood vessels, the gut, muscles and in other parts of the body.

Collagen taken from other mammals, such as pigs, is sometimes used for wound healing and other medical applications in humans. Conticello's lab is one of only a few dozen around the world focused on developing synthetic collagen suitable for applications in biomedicine and other complex technologies. Such synthetic "designer" biomaterials can be controlled in ways that natural collagen cannot.

"As far back as 30 years ago, it became possible to control the sequence of collagen," Conticello says. "The field has really picked up steam, however, during the past 15 years due to advances in crystallography and electron microscopy, which allows us to better analyze structures at the nano-scale."

The development of the new shape-shifting nanomaterial at Emory was "a fortuitous accident," Conticello says. "There was an element of luck to it and an element of design."

The collagen protein is composed of a triple helix of strands that wrap around one another like a three-stranded rope. The strands are not flexible; they're stiff like pencils, and they pack together tightly in a crystalline array.

The Conticello lab has been working for a decade with collagen sheets that it developed. "A sheet is one large, two-dimensional crystal, but because of the way the peptides pack, it's like a whole bunch of pencils bundled together," Conticello explains. "Half the pencils in the bundle have their leads pointing up and the other half have their eraser-end pointing up."

Conticello wanted to try to refine the collagen sheets so that each side would be limited to one functionality. To take the pencil analogy further, one surface of the sheet would be all lead points and the other surface would be all erasers. The ultimate goal was to develop collagen sheets that could be integrated with a medical device by making one surface compatible with the device and the other surface compatible with functional proteins in the body.

When the researchers engineered these separate types of surfaces into single collagen sheets, however, they were surprised to learn that it caused the sheets to curl up like scrolls. They then found that the shape-shifting transition was reversible -- they could control whether a sheet was flat or scrolled simply by changing the pH of the solution it was in. They also demonstrated that they could tune the sheets to shape shift at particular pH levels in a way that could be controlled at the molecular level through design.
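
The release does not give the underlying design equations, but the idea of a sharp, tunable pH threshold can be illustrated with the standard Henderson-Hasselbalch relation: the protonation state of an ionizable side chain switches rapidly around its effective pKa, and shifting that pKa by sequence design shifts the transition point. A toy sketch; the pKa values below are hypothetical, not taken from the paper:

```python
def protonated_fraction(pH, pKa):
    """Henderson-Hasselbalch: fraction of an ionizable acidic group
    that is protonated (neutral) at a given pH."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

# Two hypothetical effective pKa values standing in for two designed
# collagen variants (placeholders; the real design parameters are not
# given in this press release).
for pKa in (5.5, 7.0):
    for pH in (4.0, 6.0, 7.4):
        f = protonated_fraction(pH, pKa)
        print(f"pKa={pKa}: at pH {pH}, {100 * f:.0f}% protonated")
```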

"It's particularly interesting that the condition around which the transition occurs is a physiological condition," Conticello says. "That opens the potential to find a way to load a therapeutic into a collagen tube under controlled, laboratory conditions. The collagen tube could then be tuned to unfurl and release the drug molecules it contains after it enters the pH environment of a human cell."

Emory scientists who contributed to measuring and characterizing the new nanomaterial and co-authored the paper include chemistry professors Brian Dyer and Khalid Salaita; chemistry graduate students Alisina Bazrafshan and Helen Siaw; and Arthur McCanna from the Robert P. Apkarian Integrated Electron Microscopy Core.

Credit: 
Emory Health Sciences

MRI frequently underestimates tumor size in prostate cancer

FINDINGS

A study led by researchers at the UCLA Jonsson Comprehensive Cancer Center has found that magnetic resonance imaging, or MRI, frequently underestimates the size of prostate tumors, potentially leading to undertreatment.

The study authors found that such underestimation occurs most often when the MRI-measured tumor size is small and the PI-RADS score, which is used to classify lesions in prostate MRI analysis, is low.

For prostate tumor treatments to be successful, both the MRI size measurement and PI-RADS score must be accurate because they allow physicians to determine precisely where tumors end and where the normal, healthy tissue surrounding them begins.

BACKGROUND

MRI is frequently used to diagnose and manage prostate cancer. It is also increasingly used as a means to map and guide delivery of new, highly focused therapies that use freezing (cryotherapy), ultrasound (HIFU) and heat (laser ablation) to destroy cancerous tissue in the prostate gland while sparing healthy tissue.

METHOD

Researchers compared MRI-measured tumor size with actual tumor size after prostate removal in 441 men treated for prostate cancer.

IMPACT

Improving the ability to predict ablation margins will allow for more successful treatments for men with prostate cancer and can help reduce the morbidity of prostate cancer treatment.

Credit: 
University of California - Los Angeles Health Sciences

Insights into the Yellowstone hotspot

image: The break in slope is coincident with a discontinuity representing a regional three- to five-million-year hiatus separating two distinct styles of volcanic activity in the Cascadia backarc region. The lower slope is composed of high-K calc-alkaline lavas derived from melting of a hydrated mantle source from 30 to 20 Ma. The steeper upper slope is composed of thin flows of the tholeiitic Steens Basalt (early lavas of the Columbia River Basalt Group) derived from melting of a dry mantle source with a plume component at ca. 17 Ma.

Image: 
Victor Camp and Ray Wells

Boulder, Colo., USA: The Yellowstone hotspot is well known for generating supereruptions in the geologic past that are far more explosive than historic examples. The origin and sustained longevity of the hotspot are less well understood, and debate centers on two competing models, in which the ascent of hot mantle derives from either a deep-seated mantle plume or a shallow mantle source.

In their study published this month in GSA Today, Vic Camp and Ray Wells use an integrated database that supports the idea of a deep mantle-plume origin for the Yellowstone hotspot with a robust history of magmatism that extends to at least 56 million years ago, far older than previously thought. In this scenario, hotspot volcanism began offshore and migrated to the east-northeast across northeastern California, northern Nevada, southeastern Oregon, and southern Idaho to its current position at Yellowstone National Park.

This long-lived path of hotspot migration is marked by a belt of aligned volcanic provinces that display progressively younger ages to the east-northeast, similar to the age progression produced by southwest motion of the North America plate over a fixed Yellowstone hotspot.
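
As a back-of-the-envelope illustration of reading such an age progression (the numbers below are hypothetical placeholders, not figures from Camp and Wells), the plate speed implied by a hotspot track is simply the track length divided by its age span:

```python
# Implied plate speed from a hotspot track: distance / time.
# Both values below are rough placeholders for illustration only.
track_length_km = 700.0  # hypothetical distance along the volcanic belt
age_span_myr = 16.0      # hypothetical age span of that belt (million years)

speed_km_per_myr = track_length_km / age_span_myr
speed_cm_per_yr = speed_km_per_myr * 1e5 / 1e6  # km/Myr -> cm/yr

print(f"Implied plate motion over the hotspot: {speed_cm_per_yr:.1f} cm/yr")
```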

Credit: 
Geological Society of America

Energy sorghum may combine best of annual, perennial bioenergy crops

image: CABBI researcher Caitlin Moore, standing in an Illinois sorghum field, was a lead author on a study that found annual energy sorghum behaved more like miscanthus than maize in the way it captured light and used water to produce abundant biomass. The work signals the plant's potential as a sustainable bioenergy crop.

Image: 
Center for Advanced Bioenergy and Bioproducts Innovation (CABBI)

Large perennial grasses like miscanthus are a primary target for use as bioenergy crops because of their sustainability advantages, but they take several years to establish and aren't ideal for crop rotation. Maize and other annual crops are easier to manage with traditional farming but are tougher on the environment.

Energy sorghum, a hefty annual plant with the ecological benefits of a perennial, may combine the best of both crops.

A study by researchers at the U.S. Department of Energy (DOE) Center for Advanced Bioenergy and Bioproducts Innovation (CABBI) found that energy sorghum (Sorghum bicolor) behaves more like miscanthus in the way it efficiently captures light and uses water to produce abundant biomass. It has higher nitrogen emissions like maize, but researchers believe careful fertilizer management could reduce those levels.

The study, published in Global Change Biology: Bioenergy, offers an important first look at how energy sorghum compares to maize and miscanthus grown in the Midwest, providing critical data for biogeochemical and ecological models used to forecast crop growth, productivity, and sustainability. It was led by former CABBI Postdoctoral Researcher Caitlin Moore (pictured) and her advisor, Carl Bernacchi, Plant Physiologist with the U.S. Department of Agriculture's Agricultural Research Service and Adjunct Professor of Plant Biology and Crop Sciences at the University of Illinois Urbana-Champaign (UIUC).

Sorghum appears to be a "middle-road crop," with an annual growth cycle but the ability to use much less water than maize to produce "a ton" of biomass, said Moore, now a Research Fellow at the University of Western Australia's School of Agriculture and Environment. "It certainly holds promise as a crop that supports the bioenergy economy."

The researchers conducted ecosystem-scale comparisons of carbon, nitrogen, water, and energy fluxes of Sorghum bicolor with maize and Miscanthus x giganteus at the UIUC Energy Farm during the 2018 growing season, a near-average year in terms of temperature, rainfall, and soil moisture. The fluxes reflect "the breathing of the ecosystem" - how water, carbon dioxide (CO2), nitrogen (N), and energy move between plants and the atmosphere, Moore said.

The long-term ecological sustainability of bioenergy crops depends on how well the system "breathes." An ideal crop would use water and light efficiently to maximize the amount of biomass, keep carbon in the soil instead of releasing it into the atmosphere, and require little nitrogen fertilizer, which can leach into water or react with soil to produce nitrous oxide (N2O), a potent greenhouse gas.

Miscanthus and other large perennials offer the best option for biomass production and carbon sequestration, as they feature extensive underground systems for storing carbon and nitrogen and require less fertilizer than annual crops. But those benefits can be negated if the carbon and nitrogen are disturbed as the land is converted to other uses, as with crop rotation. That could make it difficult for land managers to rotate between food and fuel crops to keep up with market demands while still maximizing long-term carbon storage.

Maize, on the other hand, is highly productive but requires a great deal of water and nitrogen, and it loses carbon stored in its ecosystem through harvesting and tilling.

Energy sorghum falls somewhere in between. As an annual, it can be easily rotated with other crops like soybeans and maize. It's photoperiod-sensitive, so it produces generous yields of biomass late into the season when grown in regions with long days. And because it is drought-tolerant, energy sorghum can be grown in low-rainfall regions, alleviating the pressure a growing biofuel industry could place on existing land used for food production.

The question, before this study, was whether energy sorghum behaved more like miscanthus or maize, and what that meant for the ecosystem. The CABBI team used eddy covariance flux towers and below-ground soil readings to record data on the plants' ecosystems — everything from wind speed and turbulence to air temperatures, atmospheric gases, humidity, and soil moisture. They also measured energy exchange by looking at "albedo," a measure of surface reflectivity. In the heat of summer, plants evapotranspire, which releases moisture and excess energy into the air; leaves with high reflectivity don't take on as much heat and energy.

Researchers found that during the peak growing season in July — when all three crops were at their maximum productive potential — carbon, water, and energy fluxes from the sorghum ecosystem were more similar to those from the miscanthus ecosystem, whereas its nitrogen fluxes more closely resembled maize.

Maize had the highest productivity but also the highest evapotranspiration, with energy sorghum and miscanthus more closely aligned. Overall, energy sorghum had the highest water use efficiency. Maize was the most efficient at turning light into biomass through photosynthesis, with miscanthus on the low end and sorghum in the middle — probably because the latter two plants have leafy, dense canopies while maize is grown in rows to maximize light penetration, Moore said.
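
Water use efficiency in studies like this is commonly expressed as carbon gained per unit of water lost, for example gross primary productivity (GPP) divided by evapotranspiration (ET). A minimal sketch; the flux values below are invented placeholders chosen only to mirror the qualitative ranking described above, not the study's measurements:

```python
# Water-use efficiency (WUE) as commonly derived from eddy covariance data:
#   WUE = carbon uptake / water lost, e.g. GPP / ET.
crops = {
    # crop: (GPP in g C / m^2 / day, ET in kg H2O / m^2 / day)
    "maize":      (22.0, 5.5),
    "sorghum":    (18.0, 3.6),
    "miscanthus": (16.0, 3.8),
}

for crop, (gpp, et) in crops.items():
    wue = gpp / et  # g C per kg H2O
    print(f"{crop:>10}: WUE = {wue:.1f} g C / kg H2O")
```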

The N2O flux was higher from maize and energy sorghum than from miscanthus. The researchers concluded that they likely used too much fertilizer on the sorghum, even though it received half the amount applied to maize. June rains that waterlogged the low-lying sorghum fields might have exacerbated nitrogen losses. The scientists suspect that sorghum has a much lower need for nitrogen, akin to miscanthus.

Continued observations of fluxes for the three crops will be an important next step for comparing their responses to drought, floods, and other extreme climate events and assessing year-to-year biogeochemical differences. A detailed understanding of the interaction between crop type, climate, and management will be critical for forecasting the long-term sustainability of these key bioenergy crops, which will play an important role in ensuring the U.S. meets the 2050 cellulosic bioenergy requirements mandated in the Energy Independence and Security Act of 2007, the study said.

Credit: 
University of Illinois at Urbana-Champaign Institute for Sustainability, Energy, and Environment

Seafood strategies

The "Executive Order on Promoting American Seafood Competitiveness and Economic Growth," issued by the Trump administration in May 2020, lays out a plan to expand the U.S. seafood industry, especially aquaculture, and enhance American seafood competitiveness in the global market.

The goals of the directive are focused largely on growth and expansion of the industry, which includes wild-caught fisheries and farm-raised products, as well as recreation, processing and other industries that rely on fishing.

"The seafood industry in general is worth about $200 billion and accounts for 2 million jobs in the United States," said Halley E. Froehlich, a professor of fisheries and aquaculture at UC Santa Barbara, who with her colleagues finds that the executive order "ends up being a complicated and opaque ask," given the complexity of the seafood industry and the headwinds it has been experiencing of late.

"We started having some deep conversations about policy implications and what they meant relative to some massive disruptions," she said.

The seafood industry had already been fighting for stability in light of a two-year trade war with China -- the world's largest seafood consumer -- and by the time the executive order to expand was issued, U.S. seafood was in a historic freefall due to COVID-19. Meanwhile, slow pandemic relief funding and upcoming changes in the White House add another layer of uncertainty to the future of the seafood industry and its sustainable expansion.

However, it is possible to map out a sustainable means for growing the seafood industry, according to the researchers. Looking through the lens of the executive order, they outlined several guiding principles to bridge the current state of U.S. seafood and the desired outcomes of the federal directive.

Their recommendations are published in the journal Marine Policy.

Strategy and Data

Americans have a huge appetite for seafood. The U.S. is the world's largest net seafood importer, with a growing dependence on imports. According to the study, 85% of U.S. domestic wild-caught stocks are already fished at or near maximum sustainable levels.

Farmed seafood, on the other hand, makes up only about 8% of domestic seafood production and holds great potential. But it doesn't have the same level of coordinated monitoring and reporting as the much larger, wild fisheries sector. Questions remain about the type and location of these new farms -- where they won't interfere with wild fisheries or encounter opposition from the local community.

To address these and other complexities, the researchers call for "precise and strategic fisheries reforms" in response to the order's move to "reduce burdens on domestic fishing and to increase production."

"We wanted to articulate the things that could be done, and things that may actually be counterproductive," Froehlich said. For instance, removing regulations on a seafood sector that is already at its maximum sustainable capacity would likely not produce the effect sought by the order.

"We're pretty much at max for our wild fisheries," she said. According to the study there are smaller, more targeted measures that might increase profitability, though it is still unlikely that production would increase substantially.

Integrating wild fisheries and aquaculture sectors would also be beneficial, according to the researchers.

"The two systems are largely managed independent of each other," Froehlich said, "and this results in linkages that exist, but aren't necessarily accounted for." By employing an ecosystems approach, and collecting and releasing comprehensive aquaculture data, it would be possible to monitor the growth of aquaculture while reducing the potential of negatively impacting wild fisheries in domestic waters.

The growth of sustainable aquaculture and the future success of the U.S. seafood industry will also require some changes in perception, both local and global. As a means of food production, aquaculture is bound to have some sort of impact on the local environment and the local community, Froehlich said, making stakeholder input, transparency and access to information essential to addressing the resistance these seafood farms may encounter. For now, the U.S. largely displaces production responsibility onto other countries.

Meanwhile, the researchers have their eye on the incoming administration, given mounting bipartisan support for different kinds of aquaculture in the U.S. and possible changes to the protectionist stance the U.S. has taken on global trade. The seafood industry was cut deep when trade relations with China deteriorated, a struggle that only worsened with the COVID-19 pandemic.

For U.S. fisheries and aquaculture, the researchers see a chance for the sector to not just recover, but build a better system that can withstand other shocks in the future, including climate change. "The pandemic has highlighted the weak points in the sector," Froehlich said, "so the chance to improve should not be overlooked."

Credit: 
University of California - Santa Barbara

Study: e-cigarettes trigger inflammation in the gut

image: The bottom frames show burst cell junctions in the gut lining after exposure to e-cigarette chemicals, compared with healthy cells in the top frames.

Image: 
HUMANOID Center of Research Excellence

Touted by makers as a "healthy" alternative to traditional nicotine cigarettes, e-cigarettes contain chemicals that, new research indicates, disrupt the gut barrier and trigger inflammation in the body, potentially leading to a variety of health concerns.

In the study, published Jan. 5, 2021 in the journal iScience, Soumita Das, PhD, associate professor of pathology, and Pradipta Ghosh, MD, professor of cellular and molecular medicine, both at UC San Diego School of Medicine and Moores Cancer Center, found with colleagues that chronic use of nicotine-free e-cigarettes led to a "leaky gut," in which microbes and other molecules seep out of the intestines, resulting in chronic inflammation. Such inflammation can contribute to a variety of diseases and conditions, including inflammatory bowel disease, dementia, certain cancers, atherosclerosis, liver fibrosis, diabetes and arthritis.

"The gut lining is an amazing entity. It is comprised of a single layer of cells that are meant to seal the body from the trillions of microbes, defend our immune system, and at the same time allow absorption of essential nutrients," said Ghosh. "Anything we eat or drink, our lifestyle choices in other words, has the ability to impact our gut microbes, the gut barrier and overall health. Now we know that what we smoke, such as e-cigarettes, negatively impacts it as well."

The researchers found that two chemicals used as a base for all e-cigarette liquid vapor -- propylene glycol and vegetable glycerol -- were the cause of inflammation.

"Numerous chemicals are created when these two are heated to generate the fumes in vaping that cause the most damage, for which there are no current regulations," said Ghosh. "The safety of e-cigarettes have been debated fiercely on both sides. Nicotine content, and its addictive nature, has always been the major focus of those who argue against its safety, whereas lack of chemicals in the carcinogens that are present in the cigarette smoke has been touted by the makers of e-cigarettes when marketing these products as a 'healthy alternative.' In reality, it's the chemicals making up the vapor liquid that we should be more concerned about as they are the cause of gut inflammation."

For the study, the team used 3D models of human intestinal tracts generated from patient cells and simulated what happens when e-cigarette vapors enter the gut lining. Researchers validated the findings using mice models of vaping in collaboration with Laura Crotty-Alexander, MD, associate professor of medicine in the Division of Pulmonary, Critical Care and Sleep Medicine at UC San Diego School of Medicine and section chief of Pulmonary Critical Care at Veterans Affairs San Diego Healthcare System.

To produce the 3D gut organoids, the researchers collected stem cells from patients' biopsies during colonoscopies and grew them in vitro. The stem cells differentiated into the four different cell types that make up the gut lining. The team then exposed the organoids to e-cigarette liquid vapor, mimicking the frequency of a chronic vaper.

They noted that epithelial tight junction markers, which are zipper-like proteins that form the gut's first physical barrier, began to break or loosen, causing pathogens from the vapor to seep into the surrounding immune system and wreak havoc on the protective cells that lie just beneath.

Such cells act as a defense against infection by clearing pathogenic microbes and initiating certain immune responses in the body. When exposed to the e-cigarette liquid, the cells were quickly overwhelmed, unable to effectively clear pathogens, resulting in gut inflammation.

The study is part of work at the HUMANOID Center of Research Excellence, a core facility based at UC San Diego School of Medicine and led by Ghosh and Das, the study's senior author. Scientists at the center use a variety of human organoids and other tools to model diseases and their effects.

"This is the first study that demonstrates how chronic exposure to e-cigarettes increases the gut's susceptibility to bacterial infections, leading to chronic inflammation and other health concerns," said Das. "Given the importance of the gut barrier in the maintenance of the body's immune homeostasis, the findings provide valuable insight into the potential long-term harmful effects chronic use of e-cigarettes on our health."

Ghosh said damage to the gut lining may be reversible over time if the inciting factor, in this case e-cigarette use, is eliminated, but the effects of chronic inflammation upon other organs, such as the heart or brain, may be irreversible. In the future, Ghosh said she and colleagues plan to look at different flavorings of e-cigarettes to determine what effects they might have on the gut.

Credit: 
University of California - San Diego

Oldest hominins of Olduvai Gorge persisted across changing environments

image: Olduvai (now Oldupai) Gorge, known as the Cradle of Humankind, is a UNESCO World Heritage site in Tanzania. New interdisciplinary field work has led to the discovery of the oldest archaeological site in Oldupai Gorge, which shows that early humans used a wide diversity of habitats amidst environmental changes across a 200,000-year-long period.

Image: 
Michael Petraglia

Olduvai (now Oldupai) Gorge, known as the Cradle of Humankind, is a UNESCO World Heritage site in Tanzania made famous by Louis and Mary Leakey. New interdisciplinary field work has led to the discovery of the oldest archaeological site in Oldupai Gorge, reported in Nature Communications, which shows that early humans used a wide diversity of habitats amidst environmental changes across a 200,000-year-long period.

Located in the heart of eastern Africa, the Rift System is a prime region for human origins research, boasting extraordinary records of extinct human species and environmental change spanning several million years. For more than a century, archaeologists and human palaeontologists have been exploring the East African Rift outcrops and unearthing hominin fossils in surveys and excavations. However, understanding of the environmental contexts in which these hominins lived has remained elusive due to a dearth of ecological studies conducted in direct association with the cultural remains.

In the new study, published in Nature Communications, researchers from the Max Planck Institute for the Science of Human History teamed up with lead partners from the University of Calgary, Canada, and the University of Dar es Salaam, Tanzania, to excavate the site of 'Ewass Oldupa' (meaning 'on the way to the Gorge' in the local Maa language, as the site straddles the path that links the canyon's rim with its bottom). The excavations uncovered the oldest Oldowan stone tools ever found at Oldupai Gorge, dating to ~2 million years ago. Excavations in long sequences of stratified sediments and dated volcanic horizons indicated hominin presence at Ewass Oldupa from 2.0 to 1.8 million years ago.

Fossils of mammals (wild cattle and pigs, hippos, panthers, lions, hyena, primates), reptiles and birds, together with a range of multidisciplinary scientific studies, revealed habitat changes over 200,000 years in riverine and lake systems, including fern meadows, woodland mosaics, naturally burned landscapes, lakeside palm groves and dry steppe habitats. The uncovered evidence shows periodic but recurrent land use across a subset of environments, punctuated by periods with no evidence of hominin activity.

Dr. Pastory Bushozi of Dar es Salaam University, Tanzania, notes, "the occupation of varied and unstable environments, including after volcanic activity, is one of the earliest examples of adaptation to major ecological transformations."

Hominin occupation of fluctuating and disturbed environments is unique for this early time period and shows complex behavioural adaptations among early human groups. In the face of changing habitats, early humans did not substantially alter their toolkits; instead, their technology remained stable over time. Indicative of their versatility, typical Oldowan stone tools - pebble and cobble cores, sharp-edged flakes, and polyhedral cobbles - continued to be used even as habitats changed. The implication is that by two million years ago, early humans had the behavioural capacity to continually and consistently exploit a multitude of habitats, using reliable stone toolkits, likely to process plants and butcher animals over the long term.

Though no hominin fossils have yet been recovered from Ewass Oldupa, hominin fossils of Homo habilis were found just 350 metres away, in deposits dating to 1.82 million years ago. While it is difficult to know if Homo habilis was present at Ewass Oldupa, Professor Julio Mercader of the University of Calgary asserts that "these early humans were surely ranging widely over the landscape and along shores of the ancient lake." Mercader further notes that this does not discount the possibility that other hominin species, such as the australopithecines, were also using and making stone tools at Ewass Oldupa, as we know that the genus Paranthropus was present in Oldupai Gorge at this time.

Credit: 
Max Planck Institute of Geoanthropology

Researchers repurpose 'damaged' polymer optical fibers to precisely measure magnetic fields

image: A magnetic field sensor based on multimodal interference in a "fused" polymer optical fiber achieves an ultrahigh sensitivity of 113.5 pm/mT.

Image: 
Yokohama National University

The invention of optical fibers has revolutionized not only telecommunications but also sensing technology. Optical fiber sensors can measure strain, temperature, pressure, and many other physical parameters along the fibers, but they are inherently immune to electromagnetic noise -- interference from external electric or magnetic fields. That immunity is a desirable trait, until the effect of the electromagnetic field on the fibers is itself what needs to be measured. Now, an international team of researchers has used what was previously considered a 'damaged' part of an optical fiber to develop just such a magnetic field sensor.

They published details of their approach on Nov. 5 in Advanced Photonics Research.

"This nature of immunity to electromagnetic noise is a great merit when we measure strain, temperature, etc., under strong electromagnetic field environments," said paper co-author Yosuke Mizuno, associate professor at the Faculty of Engineering, Yokohama National University. "However, it simultaneously means that electromagnetic field sensing using optical fibers is a major challenge, which we tackled in this paper."

The researchers took advantage of the 'fiber fuse' effect, which is induced when high-power light is injected into an optical fiber with tight bends, bad connectors, or other non-ideal conditions. In such a compromised fiber, the optical energy becomes 'trapped' in the core, generating an optical discharge that propagates toward the light source and permanently damages the fiber in the process. The research team found that, when the fiber is made of polymer, this effect leaves behind an electrically conductive carbonized path, which can in turn enable the interactions needed to react to magnetic fields.

"The interactions between the magnetic field and the carbonized - or 'damaged' - regions can lead to variations in optical parameters in the fiber," said lead author Arnaldo Leal-Junior, professor in the Graduate Program in Electrical Engineering, Federal University of Espírito Santo. "By sandwiching a fused polymer fiber between two silica single-mode fibers and inducing what we call multimodal interference, a fiber-optic magnetic field sensor can be implemented."

The researchers experimentally showed that this sensor can detect a magnetic field change as small as 45 microtesla, several hundred times smaller - that is, more sensitive - than the 20 millitesla detectable by a conventional fiber-optic method. For comparison, a magnetic field of about 100 microtesla can be measured an inch away from an operating kitchen microwave.
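
Those two figures fit a simple linear sensor model: the wavelength shift of the interference spectrum equals the sensitivity (113.5 pm/mT, quoted in the image caption above) times the field change, so the smallest resolvable field is set by the interrogator's wavelength resolution. A quick sketch; the ~5 pm resolution is our inference from the quoted numbers, not a figure from the paper:

```python
# Linear response model for the multimodal-interference fiber sensor:
#   delta_lambda = S * delta_B, with S = 113.5 pm/mT as reported.
S_PM_PER_MT = 113.5

def field_from_shift(delta_lambda_pm):
    """Convert a measured wavelength shift (pm) into a field change (mT)."""
    return delta_lambda_pm / S_PM_PER_MT

# Resolving the reported 45 microtesla (0.045 mT) change implies a
# wavelength resolution of roughly S * 0.045 mT ~ 5 pm.
print(f"Shift for 0.045 mT: {S_PM_PER_MT * 0.045:.1f} pm")
print(f"Field for a 10 pm shift: {field_from_shift(10.0) * 1000:.0f} microtesla")
```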

"Magnetic field sensors are often required in handling various apparatuses in electric power systems, such as generators and motors," Mizuno said. "We anticipate that the merits of our sensor, including electrical insulation and long measurement range, can be exploited in such applications."

Leal-Junior also noted that the proposed sensor can be easily fabricated at a low cost and that their approach paves the way for a novel recycling option by salvaging fused polymer optical fibers for use in magnetic field sensors.

The researchers are planning to improve the measurement accuracy as well as further enhance the sensitivity of the proposed sensor. They will also attempt to use the same approach to demonstrate electric field sensing in the near future.

Credit: 
Yokohama National University

A subtle change in the DNA may predispose to polyneuropathy after gut infection

image: Patients with GBS characteristically produce autoantibodies to sialic acid-containing glycolipids called gangliosides present in neuronal cells. These autoantibodies are thought to bind to gangliosides present on the surface of neurons, and cause paralysis and sensory disturbance by damaging neurons. Here we demonstrate that a rare variant of the molecule called Siglec-10 accumulates in patients with GBS. Siglec-10 is a cell surface molecule present in various immune cells including antibody-producing cells. Siglec-10 recognizes sialic acid-containing molecules including gangliosides as ligands, and is known to inhibit activation of antibody-producing cells upon interaction with the ligands. Normal Siglec-10 may inhibit activation of anti-ganglioside antibody-producing cells when these cells interact with gangliosides present in neurons. In contrast, the Siglec-10 variant accumulated in patients with GBS shows impaired binding to gangliosides. Therefore, this Siglec-10 variant may fail to inhibit activation of anti-ganglioside antibody-producing cells, thereby allowing production of anti-ganglioside autoantibodies.

Image: 
Department of Immunology, TMDU

Tokyo, Japan - Guillain-Barré syndrome is an infamous autoimmune neuropathy, yet genetic variants predisposing individuals to this disease have yet to be described. In a new study, researchers from Tokyo Medical and Dental University (TMDU) discovered two novel genetic variants affecting a protein made by antibody-forming immune cells, providing a possible mechanism for the development of the disease.

The body's immune system is supposed to fight off invaders; in autoimmune diseases, however, this defense goes rogue and attacks the host instead through the production of autoantibodies. Guillain-Barré syndrome (GBS) is an acutely developing autoimmune peripheral neuropathy that leads to muscle weakness and numbness. It is based on the production of autoantibodies against gangliosides, a specific type of lipid molecule on the membranes of cells of the nervous system; these autoantibodies damage neurons and result in polyneuropathy, that is, damage to multiple peripheral nerves that may cause muscle weakness and numbness. GBS is often preceded by an immune stimulation such as an infection. Indeed, infection with the bacterium Campylobacter jejuni, which causes a diarrheal illness, is the most common event before GBS develops. However, as not all patients with this bacterial illness then develop GBS, it has long been thought that genetic variants (small differences in the DNA between individuals) may be what predisposes patients with GBS to the polyneuropathy.

"GBS remains somewhat of a medical mystery. We do not fully understand why patients develop this disease," says corresponding author of the study Professor Takeshi Tsubata. "The goal of our study was to identify genetic variants in patients with GBS and provide a potential mechanism for the production of autoantibodies that lead to the development of polyneuropathy in these patients."

To achieve their goal, the researchers focused on the protein Siglec-10. Siglec-10 is produced by B lymphocytes, a specific type of immune cell that produces antibodies, and binds to gangliosides. The researchers hypothesized that Siglec-10 may play an inhibitory role in the production of antibodies against gangliosides, and in turn that genetic variants in Siglec-10 may diminish this inhibitory role and thus facilitate the development of GBS. By analyzing the DNA sequence encoding the protein Siglec-10 in patients with GBS, the researchers identified two rare variants that change the amino acid sequence of the protein. No patient carried only one of the two variants, probably because the two variants are located very close together in the Siglec-10 gene.

The researchers then made the GBS-specific Siglec-10 protein in the lab to understand how it differs from the normal Siglec-10 protein at the molecular level. They found that only one of the two variants was responsible for the deleterious effects of the alternate protein, causing a marked alteration in the molecular structure of the protein and, in turn, significantly impairing its ability to bind gangliosides.

"These are striking results that show how Siglec-10 suppresses antibody production of gangliosides, and in turn how a variant protein may predispose to the development of Guillain-Barré syndrome. These findings help us understand the pathophysiology of the disease," says first author of the study Amin Alborzian Deh Sheikh.

Credit: 
Tokyo Medical and Dental University

Study explains role of bone-conducted speech transmission in speech production and hearing

image: A loudspeaker was embedded in an enclosure fitted to the hard palate to make a quasi-static sound field in the OC. The probe microphone was placed in the sound field for measuring the response signal.

Image: 
Masashi Unoki from Japan Advanced Institute of Science and Technology

The perception of our own voice depends on sound transmission through air (air-conducted) as well as through the skull bone (bone-conducted or BC). The transmission properties of BC speech are, however, not well understood. Now, scientists from Japan Advanced Institute of Science and Technology report their latest findings on BC transmission under the influence of oral cavity sound pressure, which can boost BC-based technology and basic research on hearing loss and speech impairment.

Ever wondered why your voice sounds different in a recording compared to how you perceive it as you speak? You are not alone. The reason has to do with the two different types of transmission of our own voice, namely, air-conducted (AC) speech and bone-conducted (BC) speech. In the case of AC speech, the voice is transmitted through the air via lip radiation and diffraction, whereas for BC speech, it is transmitted through the soft tissue and the skull bone. This is why when we hear ourselves in a recording, we only perceive the AC speech, but while speaking, we hear both the AC and the BC speech. In order to understand, then, the relationship between speech production and perception, both these speech transmission processes need to be accounted for.

This has been further corroborated by recent scientific investigations which show that BC speech transmission affects the perception of our own voice similarly to AC speech transmission. However, the transmission process of BC speech remains to be understood accurately.

In a new study published in Journal of Speech, Language, and Hearing Research, a team of scientists from Japan Advanced Institute of Science and Technology (JAIST) attempted to understand the BC speech transmission process by studying the vibrations of the regio temporalis (or RT, the temple region of the head) and the sound radiation in the ear canal (EC) caused by sound pressure in the oral cavity (OC). Professor Masashi Unoki of JAIST, who was involved in the study, outlines their approach, "We assumed a transmission pathway model for BC speech in which sound pressure in the OC is assumed to cause vibration of the soft tissue and the skull bone to reach the outer ear. Based on this assumption, we focused on how excitations in the OC would affect the BC transmission to the RT and the EC."

For measuring BC transmission, the scientists selected five university students (three men and two women) aged 23-27 years with normal hearing and speaking ability. They fitted a small loudspeaker to each participant's hard palate (the structure that sits at the front of the roof of the mouth) and then played computer-generated excitation signals through it. The response signals were recorded simultaneously on the skin of the left RT and in the right EC with a BC microphone and a probe microphone, respectively. Each participant underwent 10 measurement trials.

The team found, upon analyzing the transfer function (which models the frequency response of a system), that the RT vibration supports all frequencies up to 1 kHz while the EC sound pressure accentuates frequencies in the 2-3 kHz range. Combining this observation with an earlier report which showed that BC speech is perceived as "low pitch", or dominated by low frequencies, the team concluded that the EC transmission, which cuts off both very low and very high frequencies, does not play a major role in BC speech perception.
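
For readers unfamiliar with transfer-function measurements, the standard 'H1' estimate divides the cross-spectrum of excitation and response by the excitation's power spectrum. The sketch below is a generic textbook illustration using synthetic signals (a toy band-pass system standing in for the ear canal), not the authors' actual processing pipeline:

```python
import numpy as np
from scipy import signal

fs = 16000  # sampling rate in Hz, arbitrary for this demo

# Synthetic excitation x (broadband noise, like a loudspeaker drive) and a
# toy "ear canal" response y that emphasizes 2-3 kHz, as reported above.
rng = np.random.default_rng(0)
x = rng.standard_normal(fs * 4)  # 4 seconds of excitation
b, a = signal.butter(4, [2000, 3000], btype="bandpass", fs=fs)
y = signal.lfilter(b, a, x)

# H1 transfer-function estimate: H(f) = Pxy(f) / Pxx(f).
f, Pxx = signal.welch(x, fs=fs, nperseg=1024)
_, Pxy = signal.csd(x, y, fs=fs, nperseg=1024)
H = Pxy / Pxx

peak = f[np.argmax(np.abs(H))]
print(f"Response peaks near {peak:.0f} Hz for this toy band-pass system")
```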

The results have excited scientists who foresee several applications of BC transmission in future technology. As Prof. Unoki surmises, "Our findings can be used to develop BC-based headphones and hearing aids in the future as well as provide speech on BC microphones and speakers currently employed by smart wearable devices. In addition, it can be used to investigate auditory feedback in BC speech communications that can influence basic research on hearing loss and speech disorders."

Based on these promises, it looks like future technology might one day even help us express ourselves better!

Credit: 
Japan Advanced Institute of Science and Technology

Autoimmune diseases: similar molecular signatures in target tissues

Autoimmune diseases are diseases of "mistaken identity", in which the immune system - which is supposed to protect us against infectious diseases and neoplasias - mistakenly attacks and destroys components of our own body. The incidence of autoimmune diseases is increasing worldwide, and these diseases - including type 1 diabetes (T1D), systemic lupus erythematosus (SLE), multiple sclerosis (MS) and rheumatoid arthritis (RA) - now affect up to 5% of the population in different regions. There is no cure for autoimmune diseases, and while the immune targets of T1D, SLE, MS, and RA are distinct, these diseases share several similar elements, including up to 50% common genetic risk, chronic local inflammation and mechanisms mediating target tissue damage.

In spite of these common features, autoimmune disorders are traditionally studied independently and with a focus on the immune system rather than on the target tissues. There is, however, increasing evidence that the target tissues of these diseases are not innocent bystanders of the autoimmune attack but participate in a deleterious dialogue with the immune system that contributes to their own demise, as first shown by Eizirik's group for T1D. Furthermore, in T1D, several of the risk genes for the disease act at the target tissue level - in this case pancreatic β-cells - regulating the responses to viral infections, the dialogue with the immune system and apoptosis. Against this background, the authors hypothesized that key inflammation-induced mechanisms, potentially shared between T1D, SLE, MS and RA, may drive similar molecular signatures at the target tissue level. Discovering these similar (or, in some cases, divergent) disease-specific signatures may allow the identification of key pathways that could be targeted for therapy, including the re-purposing of drugs already in clinical use for other diseases.

To test this hypothesis, they obtained RNA sequencing datasets (i.e. studies in which all genes expressed in a diseased tissue, as compared to a healthy one, are identified) from pancreatic β-cells from controls or individuals affected by T1D, from kidney cells from controls or individuals affected by SLE, from the optic chiasm from controls or individuals affected by MS, and from joint tissue from controls or individuals affected by RA. These studies indicate major common gene expression changes in the target tissues of the four autoimmune diseases evaluated, many of them downstream of interferons, and expression of a large fraction of candidate risk genes (>80% in all cases). One candidate gene in common between the four diseases is TYK2, which encodes a protein that regulates interferon signaling, and Eizirik's group showed that TYK2 inhibitors - already in use for other autoimmune diseases - protect β-cells against immune-mediated damage in pre-clinical models of diabetes.
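
Conceptually, the cross-disease comparison boils down to intersecting the lists of differentially expressed genes found in each target tissue. A minimal sketch with hypothetical gene sets; only TYK2 is taken from the text, the other symbols are placeholders:

```python
# Hypothetical differentially expressed (DE) gene sets per target tissue.
# Only TYK2 comes from the article; the other symbols are placeholders.
de_genes = {
    "T1D_beta_cells":  {"TYK2", "STAT1", "IRF1", "GENE_A"},
    "SLE_kidney":      {"TYK2", "STAT1", "IRF7", "GENE_B"},
    "MS_optic_chiasm": {"TYK2", "IRF1", "IRF7", "GENE_C"},
    "RA_joint":        {"TYK2", "STAT1", "IRF1", "GENE_D"},
}

# Genes altered in all four diseases: a candidate shared signature.
shared = set.intersection(*de_genes.values())
print("Shared across all four target tissues:", sorted(shared))
```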

Credit: 
Université libre de Bruxelles

Investment risk & return from emerging public biotech companies comparable to non-biotech

image: Shareholder value at the end of 2016 compared to IPO for biotech companies and paired controls completing IPOs 1997-2016. Net growth in shareholder value, and the fraction of companies generating growth, were comparable for biotech and non-biotech companies.

Image: 
Center for Integration of Science and Industry, Bentley University

Investing in biotech companies may not entail higher risk than investing in other sectors, according to a new report from Bentley University's Center for Integration of Science and Industry. A large-scale study of biotechnology companies that completed Initial Public Offerings (IPOs) from 1997-2016 demonstrates that these companies produced more than $100 billion in shareholder value and almost $100 billion in new value creation despite a failure rate greater than 50%. The study compared the financial performance and economic value created by these biotech companies to non-biotechnology controls that had similarly timed IPOs.

The findings are published in PLOS ONE in the article "Comparing long-term value creation after biotech and non-biotech IPOs, 1997-2016." Researchers studied the financial performance of 319 biotechnology companies focused on developing therapeutic products that completed IPOs from 1997-2016. These emerging, public biotech companies characteristically had little revenue, high R&D expense, and negative profit (losses) in contrast to non-biotechnology companies, which generated substantial revenue, little or no R&D, and consistent profit after 2001. Despite these differences, the study demonstrates that paired biotech and non-biotech companies had similar rates of success and failure. The analysis shows that 42% of biotechs and 40% of non-biotechs reached $1 billion valuations, while 53% of biotechs and 51% of non-biotechs failed to maintain their IPO market value. Companies also generated similar amounts of shareholder value, though non-biotechnology companies achieved greater net capital creation.

"Despite the extraordinary performance of the biotech sector in recent years, biotech is still often portrayed as a high risk investment. Our study suggests that, in fact, the high-risk, high-return pattern associated with biotech companies after their IPO is a common characteristic of companies completing IPOs from other sectors as well." said Dr. Fred Ledley, Director of the Center for Integration of Science and Industry. "This study also suggests that, for these emerging, public companies, the science-based, biotech business model generates equivalent economic value to more traditional, revenue-based business models."

This study is one part of a large survey of the finances and late stage product portfolios of companies that completed their IPOs over recent decades. These studies show that, while more than 50% of emerging, public biotechnology companies had products that achieved FDA approval, most companies developing products were acquired within several years of their first product approvals, and only four companies achieved revenues of $1 billion by the end of 2016. Thus, despite considerable success in product development and substantial value generation, biotech IPOs from 1997-2016 generated few new large, biopharmaceutical companies.

Credit: 
Bentley University

Old silicon learns new tricks

image: (a) Wide and (b) magnified images of the fabricated Si pyramids. The four slopes correspond to Si{111} facet surfaces.

Image: 
Ken Hattori

Ikoma, Japan - Ultrasmall integrated circuits have revolutionized mobile phones, home appliances, cars, and other everyday technologies. To further miniaturize electronics and enable advanced functions, circuits must be reliably fabricated in three dimensions. Achieving ultrafine 3D shape control by etching into silicon is difficult because even atomic-scale damage reduces device performance. In a new study published in Crystal Growth & Design, researchers at Nara Institute of Science and Technology (NAIST) report silicon etched into atomically smooth pyramids. Coating these silicon pyramids with a thin layer of iron imparts magnetic properties that until now were only theoretical.

NAIST researcher and senior author of the study Ken Hattori is widely published in the field of atomically controlled nanotechnology. One focus of Hattori's research is in improving the functionality of silicon-based technology.

"Silicon is the workhorse of modern electronics because it can act as a semiconductor or an insulator, and it's an abundant element. However, future technological advances require atomically smooth device fabrication in three dimensions," says Hattori.

A combination of standard dry etching and chemical etching is necessary to fabricate arrays of pyramid-shaped silicon nanostructures. Until now, atomically smooth surfaces have been extremely challenging to prepare.

"Our ordered array of isosceles silicon pyramids were all the same size and had flat facet planes. We confirmed these findings by low-energy electron diffraction patterns and electron microscopy," explains lead author of the study Aydar Irmikimov.

An ultrathin - 30 nanometer - layer of iron was deposited onto the silicon to impart unusual magnetic properties. The pyramids' atomic-level orientation defined the orientation - and thus the properties - of the overlying iron.

"Epitaxial growth of iron enabled shape anisotropy of the nanofilm. The curve for the magnetization as a function of the magnetic field was rectangular-like shaped but with breaking points which were caused by asymmetric motion of magnetic vortex bound in pyramid apex," explains Hattori.

The researchers found that the curve had no breaking points in analogous experiments performed on planar iron-coated silicon. Other researchers have theoretically predicted the anomalous curve for pyramid shapes, but the NAIST researchers are the first to have shown it in a real nanostructure.

"Our technology will enable fabrication of a circular magnetic array simply by tuning the shape of the substrate," says Irmikimov. Integration into advanced technologies such as spintronics - which encode information by the spin, rather than electrical charge, of an electron - will considerably accelerate the functionality of 3D electronics.

Credit: 
Nara Institute of Science and Technology