Tech

New method makes realistic water wave animations more efficient

image: The simulation covers a large area and accurately models wave-object interactions in a computationally efficient manner.

Image: 
Camille Schreck

Producing high-quality, realistic water wave animations that interact with complex objects is often computationally expensive, so designers frequently opt for methods that are fast to compute but of lower quality. Researchers at the Institute of Science and Technology Austria (IST Austria) have developed a technique that produces more realistic water wave animations at a computational cost similar to that of current approaches. The results are published today in the journal ACM Transactions on Graphics.

In general, water wave simulations are based on one of two available methods. 'Fourier-based' methods are efficient but cannot model complicated interactions, such as water hitting the shore of an island. More elaborate 'numerical' techniques, on the other hand, can simulate a wide range of such effects but are much more expensive computationally. As a result, detailed scenes such as ripples forming as a wave interacts with an island, or even a boat passing by, are practically impossible due to the sheer processing time and computational power needed. Computer scientists from Christoph Wojtan's research team at IST Austria have now developed a method that makes it possible to animate realistic waves and their interaction with solid objects at large scale and in a computationally efficient manner.
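To make the efficiency contrast concrete, here is a minimal sketch of the classic Fourier-based approach (our illustration, not the IST Austria method; the grid size and spectrum are hypothetical). Each wave component oscillates independently under the deep-water dispersion relation, so advancing the entire surface by one frame costs a single inverse FFT:

```python
# Minimal sketch of a Fourier-based wave simulator (illustrative only --
# not the IST Austria method). Each wave component evolves independently
# under the deep-water dispersion relation omega(k) = sqrt(g*|k|), so one
# animation frame costs just one inverse FFT.
import numpy as np

g = 9.81       # gravitational acceleration, m/s^2
N = 256        # grid points
L = 100.0      # domain length, m

k = 2.0 * np.pi * np.fft.fftfreq(N, d=L / N)   # angular wavenumbers
omega = np.sqrt(g * np.abs(k))                 # deep-water dispersion

# Random initial spectrum decaying with |k| -- a stand-in for a proper
# ocean spectrum such as the Phillips spectrum.
rng = np.random.default_rng(0)
h0 = (rng.normal(size=N) + 1j * rng.normal(size=N)) / (1.0 + np.abs(k))**2

def height(t):
    """Surface height at time t; taking the real part keeps the sketch simple."""
    return np.real(np.fft.ifft(h0 * np.exp(1j * omega * t)))

surface = height(2.5)   # the whole surface, advanced in O(N log N) time
```

The efficiency comes at a price: because every component is a freely traveling plane wave, such a simulator cannot by itself represent waves reflecting off an island or a boat, which is exactly the limitation the IST Austria work addresses.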

From theory to industry

Achieving this feat required both innovation and a deep understanding of the physics involved, including solving the complex mathematical equations that model wave-surface interactions. The team will showcase their method at SIGGRAPH 2019, the annual conference on computer graphics convened by the ACM SIGGRAPH (Special Interest Group on Computer GRAPHics and Interactive Techniques) organization at the end of July, and they anticipate numerous applications in the movie and video game industries. Scenes such as boats moving past islands and rain droplets hitting water are possible with this method, and have now been demonstrated to be computationally efficient.

Credit: 
Institute of Science and Technology Austria

Foundational study explores role of diet in diabetes complications

image: A Western diet significantly increased immune cell accumulation (indicated by white arrowheads) in the retinal microvessels of diabetic rats (with permission from Barakat et al.). Which component of the diet causes this is the question pursued here.

Image: 
Barakat et al

Type 1 and type 2 diabetes affect the health of the inner lining of blood vessels. People with diabetes often experience complications in the eyes, heart, and other organs because of worsening blood vessel damage over the long term. One of the earliest signs of systemic inflammation in the blood vessels is the increased sticking of immune cells to the inner lining. As inflammation and microvascular damage continue in the light-sensitive tissue in the back of the eye -- the retina -- diabetic retinopathy can ensue. Diabetic retinopathy is a leading cause of severe vision loss and blindness. A pressing question in diabetes research is how diet, through elevated blood levels of sugar, cholesterol, and fat, may contribute to blood vessel damage. A new study by investigators from Brigham and Women's Hospital set out to determine which components of the Western diet -- one rich in sugar, cholesterol and fat -- may worsen diabetes complications. The team examined the effects of different dietary fats on the earliest molecular signs of retinal inflammation and damage in an experimental rodent model of type 1 diabetes. The results are published in The FASEB Journal.

"Solid information about the effects of nutrition on disease development or progression is a rarity, but foundational work in preclinical models can help set the stage for clinical implications," said corresponding author Ali Hafezi-Moghadam, MD, PhD, Director of the Molecular Biomarkers Nano-Imaging Laboratory at the Brigham and Associate Professor of Radiology at Harvard Medical School. "We want to understand who is at risk for diabetic retinopathy and what dietary steps can be taken to slow down disease progression, but to take those steps, we must first understand the effects and interplay of the various components of diet."

To do so, the team used an established rat model of type 1 diabetes, known as streptozotocin (STZ)-diabetic rats. This model is characterized by the inability to produce insulin and by elevated levels of sugar and fat in the blood. The research team generated high-fat diets with varying fatty acid compositions, moderate amounts of carbohydrates and no sugars to tease out the effects of specific dietary components on diabetic vascular damage. The team fed these diets to the STZ-diabetic rats and then examined the accumulation of immune cells and other related readouts in the retinal blood vessels.

To examine the rat retina, the team previously developed a unique nanoprobe-based molecular imaging technique. The nanoprobes injected into the bloodstream of the rats targeted specific molecules to which immune cells bind in the retina. Using laser-scanning confocal microscopy in live animals, the team produced images from the rats' retinas that visualized the accumulation of the nanoprobes. Hafezi-Moghadam likens the image of the brightly fluorescing nanoprobes in the retina to a "starry sky" at night, where "the number of stars tells us a whole lot about the condition of the retina."

The investigators found that high levels of neither saturated nor unsaturated fats on their own increased retinal damage in this animal model, but that the combination of high levels of dietary cholesterol with specific saturated fatty acids abundant in the Western diet exacerbated the damage.

Elevated blood sugar (hyperglycemia) is a common symptom of type 1 and type 2 diabetes; however, the diseases have different mechanisms. Because diabetes complications in patients are often clinically observed after long exposure to hyperglycemia, the study of the mechanisms of complications in animal models has traditionally put less emphasis on the manner in which the animals develop hyperglycemia. The lab introduced and is currently developing a realistic model of type 2 diabetes known as the Nile grass rat. In the future, the team will leverage this model and explore the contributions of other dietary components to vascular damage in type 2 diabetes.

"This work lays the foundation for further examination of the relationship between levels of fat in the blood, dietary fats, and the development of diabetes complications," said lead author Aliaa Barakat, PhD, a senior research scientist in the Molecular Biomarkers Nano-Imaging Laboratory at the Brigham. "Dietary carbohydrates and dietary fats have related and overlapping metabolic effects. Future experiments are warranted across a spectrum of hormonal changes characteristic of treated type 1 diabetes and treated and untreated type 2 diabetes. Subsequent work will also address mechanisms behind our findings involving the interaction between dietary sugar, cholesterol and saturated fatty acids."

Credit: 
Brigham and Women's Hospital

Barbara now a major hurricane on NASA satellite imagery

image: On July 2, 2019, the MODIS instrument aboard NASA's Terra satellite provided a visible image of Hurricane Barbara in the Eastern Pacific Ocean, now far from western Mexico.

Image: 
NASA Worldview, Earth Observing System Data and Information System (EOSDIS)

NASA's Terra and Aqua satellites passed over the Eastern Pacific Ocean after Tropical Storm Barbara strengthened into the first hurricane of the season. Barbara intensified rapidly into a major hurricane.

NOAA's National Hurricane Center (NHC) noted that Barbara intensified early during the morning of July 2 and could strengthen a little more. Fortunately, Barbara is over 1,000 miles west of the southern tip of Baja California, and there are no coastal watches or warnings in effect.

On July 2, the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Terra satellite provided a visible image of Barbara that showed powerful thunderstorms circling an eye. Bands of thunderstorms wrapped into the center from the southern and eastern quadrants.

An infrared look by NASA's Aqua satellite on July 2 at 5:17 a.m. EDT (0917 UTC) revealed where the strongest storms were located within Hurricane Barbara. The Atmospheric Infrared Sounder or AIRS instrument aboard NASA's Aqua satellite analyzed cloud top temperatures and found that the strongest thunderstorms, circling the eye, had cloud tops as cold as or colder than minus 81.6 degrees Fahrenheit (minus 63.1 degrees Celsius); the eye was seen in a lighter color in a false-colored NASA image. Cloud top temperatures that cold indicate strong storms that have the capability to create heavy rain.

At 11 a.m. EDT (1500 UTC), the NHC noted the eye of Hurricane Barbara was located near latitude 12.5 degrees north and longitude 122.2 degrees west, about 1,080 miles (1,740 km) southwest of the southern tip of Baja California, Mexico. Barbara was moving toward the west-northwest near 14 mph (22 kph) and is forecast to slow its forward speed later today before turning toward the northwest in a day or two. The estimated minimum central pressure was 948 millibars (28.00 inches).

Satellite data indicate that the maximum sustained winds have increased to near 130 mph (215 kph) with higher gusts.  Barbara is a category 4 hurricane on the Saffir-Simpson Hurricane Wind Scale. Hurricane-force winds extend outward up to 25 miles (35 km) from the center and tropical-storm-force winds extend outward up to 185 miles (295 km).

Some additional strengthening is possible today, but weakening is likely to begin on Wednesday and continue into Thursday.

Credit: 
NASA/Goddard Space Flight Center

Russian engineers ready to 'light up' a lamp revolution

image: Lamp revolution.

Image: 
@tsarcyanide/MIPT Press Office

Researchers from the Moscow Institute of Physics and Technology and Lebedev Physical Institute of the Russian Academy of Sciences have designed and tested a prototype cathodoluminescent lamp for general lighting. The new lamp, which relies on the phenomenon of field emission, is more reliable, durable, and luminous than its analogues available worldwide. The development was reported in the Journal of Vacuum Science & Technology B.

While LED lamps have become commonplace, they are not the only clean and power-saving alternative to incandescent lamps. Since the 1980s, engineers around the world have been looking into the so-called cathodoluminescent lamps as another option for general lighting purposes.

Shown in figure 1, a lamp of this kind relies on the same principle that powered old cathode-ray-tube TVs: A negatively charged electrode, or cathode, at one end of a vacuum tube serves as an electron gun. A potential difference of up to 10 kilovolts accelerates the emitted electrons toward a flat, positively charged, phosphor-coated electrode -- the anode -- at the opposite end of the tube. This electron bombardment results in light.
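As a rough check on the physics (our arithmetic, not a figure from the paper), an electron crossing a 10-kilovolt potential difference gains a kinetic energy of

$$ E = eU \approx (1.6\times10^{-19}\ \mathrm{C})\times(10^{4}\ \mathrm{V}) = 1.6\times10^{-15}\ \mathrm{J} = 10\ \mathrm{keV}, $$

thousands of times the few electronvolts carried by each emitted photon, so a single accelerated electron can excite many photons' worth of luminescence in the phosphor.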

Cathodoluminescent lamps have the advantage of being able to emit light at almost any wavelength, from red to ultraviolet, depending on which fluorescent material is used.

Novel ultraviolet light bulbs would be a particularly timely development, considering the recent ban on household appliances using mercury under the Minamata Convention, a United Nations treaty signed by 128 countries that came into effect in August 2017. Among other products, the ban targets ultraviolet fluorescent tubes, widely used for greenhouse lighting and other applications. Cathodoluminescent UV light bulbs contain no mercury and are generally cleaner in service and upon disposal.

"Some industries using mercury lamps for water treatment and air disinfection, for example, will be very slow and unwilling to phase them out," commented Mikhail Danilkin of Lebedev Physical Institute, RAS. "But medicine is different, because the issue of mercury lamp disposal at individual medical facilities has not been resolved, while the environmental standards are becoming stricter. Cathodoluminescent lamps could be used in operating room decontamination, UV irradiation of throat and tonsils, and dental filling curing."

Another important advantage of the new lamp over LEDs and fluorescent bulbs is that it does not rely on the so-called critical raw materials. These include gallium, indium, and some rare-earth elements. While their supply is limited, these materials are essential and irreplaceable in the health, defense, aerospace, and other key industries. The European Commission lists them as strategically important for the European economy.

Attempts to mass-produce commercial cathodoluminescent light bulbs have been made in the United States, but consumers did not embrace the device, mostly because it was bulky and took several seconds to warm the cathode up to operating temperature. Similarly, old TV sets began displaying the image only after a brief delay.

Some cathodes require no warmup, though. They are known as field emission cathodes, because they rely on the phenomenon of field emission. It involves a cold cathode emitting electrons under an electrostatic field alone, due to tunneling.
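Quantitatively, cold-cathode emission is conventionally described by the Fowler-Nordheim law (a textbook result, not an equation from this paper), in which the emitted current density $J$ depends exponentially on the local electric field $F$ at the cathode surface and on the work function $\phi$:

$$ J = \frac{A F^{2}}{\phi}\,\exp\!\left(-\frac{B\,\phi^{3/2}}{F}\right), $$

with $A \approx 1.54\times10^{-6}\ \mathrm{A\,eV\,V^{-2}}$ and $B \approx 6.83\times10^{9}\ \mathrm{eV^{-3/2}\,V\,m^{-1}}$. The exponential sensitivity to $F$ is why sharp tips that concentrate the field, like the submicrometer protrusions described below, boost the emission current so dramatically.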

However, designing an efficient, long-lasting, and technologically advanced cathode that could be mass-produced and sold at an affordable price has proved challenging. Despite an ongoing effort in Japan and the U.S., the recent Russian study marks the first successful attempt at this.

"Our field emission cathode is made of ordinary carbon," said Professor Evgenii Sheshin, deputy chair of vacuum electronics at MIPT, who led the research team. "But this carbon is not used merely as a chemical, but rather as a structure. We found a way to fashion a structure from carbon fibers that is resistant to ion bombardment, outputs a high emission current, is technological and affordable in production. This technology is our know-how, no one else in the world has it."

Special treatment of the carbon forms many submicrometer protrusions -- less than a millionth of a meter in size -- at the tip of the cathode (figure 2). This results in an ultrahigh electric field at the tip, driving electrons out into the vacuum.

The MIPT research group has also developed a compact power source for their cathodoluminescent lamp, which supplies the several kilovolts needed for field electron emission. The source is fitted around the glass light bulb (figure 3) with almost no effect on its size.

Figure 3. Laboratory prototypes of cathodoluminescent bulbs with a built-in voltage converter for an E27 cap with a diffuser (a) and without it (b). The luminous power is up to 250 lumens, which is about the output of a 25-watt incandescent lamp, but the power consumption is only 5.5 watts. Image courtesy of the researchers
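For context, the caption's figures imply a luminous efficacy of roughly

$$ \frac{250\ \mathrm{lm}}{5.5\ \mathrm{W}} \approx 45\ \mathrm{lm/W}, $$

several times the roughly 10 lm/W of an incandescent bulb, though still short of the best LED lamps.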

The paper reports prototype tests and the lamp's technical characteristics. These data suggest that if mass-produced, the new cathodoluminescent bulb could compete with the cheap lamps based on light-emitting diodes. The new bulb would also help phase out the hazardous fluorescent lamps containing mercury, which are still used in many households.

"Unlike the LED bulb, our lamp is not afraid of elevated temperatures. You can use it where diodes quickly fade, such as in ceiling spotlights, where insufficient cooling is provided," added study co-author Dmitry Ozol from MIPT's vacuum electronics department.

Credit: 
Moscow Institute of Physics and Technology

Physicists OK commercial graphene for T-wave detection

image: This is a graphene-based transistor with a metal grating.

Image: 
Andrey Bylinkin et al./Physical Review Applied

Russian researchers from the Moscow Institute of Physics and Technology (MIPT) and Valiev Institute of Physics and Technology have demonstrated resonant absorption of terahertz radiation in commercially available graphene. This is an important step toward designing efficient terahertz detectors, which would enable faster internet and a safe replacement for X-ray body scans. The research findings were published in Physical Review Applied.

Graphene optoelectronics

Since Andre Geim and Kostya Novoselov received the 2010 Nobel Prize in physics for studying the unique electronic properties of graphene, interest in this material has never waned. Graphene is truly two-dimensional: It consists of a one-atom-thick layer of carbon, which is one of the reasons why its properties are so amazing. It is thin but mechanically strong, impermeable even to helium atoms, and conducts electricity and heat extremely well. The high mobility of electrons in graphene makes it a promising material for ultrafast photodetectors, including those operating in the terahertz range.

THz radiation, also known as T-waves, is equally difficult to generate and to detect. This gave rise to the notion of a "terahertz gap," which refers to the roughly 0.1-10 THz frequency band in the electromagnetic spectrum. There are no efficient devices for generating and detecting radiation in this range. Nevertheless, T-waves are very important for humanity: They do not harm the body and so could replace X-rays in medical scans. Also, T-waves could make Wi-Fi much faster and unlock a poorly studied band of cosmic radiation for astronomical research.

Despite the great potential of graphene for photodetection, a monolayer by itself absorbs only about 2.3% of incident radiation, which is not enough for reliable detection. A way around this is to strongly localize the field near graphene, forcing an electromagnetic wave to couple with graphene electrons and excite resonant oscillations. The resulting collective wave of the electromagnetic field and conduction electrons is known as a surface plasmon. The corresponding phenomenon of plasmon resonance is the enhanced light absorption due to the excitation of surface plasmon waves.
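That 2.3% figure is graphene's well-known universal absorption, fixed by the fine-structure constant $\alpha \approx 1/137$:

$$ A = \pi\alpha \approx \frac{\pi}{137} \approx 0.023, $$

a remarkably large value for a single atomic layer, but still far too little for a practical detector.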

Unfortunately, plasmon resonance is not observed in a continuous sheet of a conductor illuminated with plane waves. The plasmon wavelength is much shorter than that of the photon, so the two waves can hardly stay synchronous. To address this disparity, a metal grating is placed above the graphene film. It resembles a tiny comb with teeth less than a micrometer apart.
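In textbook terms, the grating acts as a momentum source (a standard grating-coupler argument, not an equation quoted from the paper): a grating of period $a$ lets the in-plane wavevector of the light pick up integer multiples of $2\pi/a$, so plasmons can be excited when

$$ k_{\mathrm{plasmon}} \approx k_{\mathrm{photon}} + \frac{2\pi m}{a}, \qquad m = \pm 1, \pm 2, \ldots $$

Because the THz photon's wavevector is tiny by comparison, the resonance condition is set almost entirely by the grating period.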

Graphene: Expectations vs. reality

Dozens of techniques are available for producing graphene. They differ in terms of end product quality and labor intensity. Researchers praising the high electron mobility in graphene have often played down how difficult this material is to manufacture.

The highest-quality graphene is produced by mechanical exfoliation. This involves placing a piece of graphite between two sticky tapes, which then rip off progressively thinner layers in multiple iterations. At some point, fragments of graphene -- that is, monolayer graphite -- emerge. Such "handmade" graphene has the best characteristics for applied devices, such as the resonant T-wave detector based on encapsulated graphene created by researchers from MIPT, Moscow State Pedagogical University, and the University of Manchester. Unfortunately, graphene flakes manufactured by mechanical exfoliation are only micrometers across, take several months to produce, and end up too expensive for serial device design.

There is an easier and scalable alternative technique for graphene synthesis called chemical vapor deposition (CVD). It involves decomposing gases -- normally, a mix of methane, hydrogen, and argon -- in a special furnace. The process leads to a graphene film forming on a copper or nickel substrate. The resulting graphene has poorer characteristics and more defects than the mechanically exfoliated one. But CVD is currently the technology best-suited for scaling up device production.

The Russian physicists set out to test whether such commercial-grade graphene is good enough for THz plasmon resonance excitation, which would make it a valid material for T-wave detectors.

"Actually, a CVD-produced graphene film is not homogeneous. Like a polycrystal, it consists of numerous merged grains. Each one is an ordered region with a completely symmetrical atomic pattern. Grain boundaries, along with defects, make working with such graphene far from easy," study co-author and MIPT graduate student Elena Titova said.

It took the team over a year to master working with CVD graphene at the Institute's Center of Shared Research Facilities. Meanwhile, the colleagues from the lab's theoretical department were convinced that no plasmon resonance would be observed. The reason is that resonance visibility is determined by the so-called quality factor -- that is, how many periods the field passes before the electron encounters a lattice defect. Theoretical estimates predicted a very low Q factor limited by frequent electron-defect collisions in CVD graphene. That said, the high electron mobility in graphene emerges not from infrequent electron collisions but from the low mass of its electrons, which enables their fast acceleration to high velocity.

Theory and experiment

Despite the pessimistic theoretical predictions, the authors of the paper decided to go ahead with the experiment. Their resolve was rewarded: The absorption spectra exhibited the peaks indicative of plasmon resonance in CVD-synthesized graphene.

"The thing is that not all defects are the same, and electrons collide with different defects in direct current measurements and THz absorption measurements," comments the research supervisor, Dmitry Svintsov, who heads the MIPT Laboratory of 2D Materials for Optoelectronics. "In a DC experiment, an electron will inevitably encounter grain boundaries on the way from one electrical contact to the other. But when exposed to T-waves, it will mostly fluctuate within a single grain, away from its boundaries. This means that defects impairing DC conductivity are actually 'safe' for T-wave detection."

A further mystery had to do with the frequency of resonant plasmon excitation, which disagreed with the previously existing theories. It turned out to be related to the geometry of the metal grating in an unexpected way. The team found that when positioned close to graphene, the grating (depicted in orange in figure 1) modified the plasmon field distribution. This led to plasmon localization under the "comb teeth," whose edges acted as mirrors for plasmons. The researchers formulated a very simple theory describing the phenomenon based on an analogy with the tight-binding model from solid-state physics. The theory reproduces the experimental data well without resorting to fitting parameters and can be used to optimize future T-wave detectors.

Credit: 
Moscow Institute of Physics and Technology

HIV eliminated from the genomes of living animals

image: Kamel Khalili, PhD, Laura H. Carnell Professor and Chair of the Department of Neuroscience, Director of the Center for Neurovirology, and Director of the Comprehensive NeuroAIDS Center at the Lewis Katz School of Medicine at Temple University

Image: 
Lewis Katz School of Medicine at Temple University

In a major collaborative effort, researchers at the Lewis Katz School of Medicine at Temple University and the University of Nebraska Medical Center (UNMC) have for the first time eliminated replication-competent DNA of HIV-1 - the virus responsible for AIDS - from the genomes of living animals. The study, reported online July 2 in the journal Nature Communications, marks a critical step toward the development of a possible cure for human HIV infection.

"Our study shows that treatment to suppress HIV replication and gene editing therapy, when given sequentially, can eliminate HIV from cells and organs of infected animals," said Kamel Khalili, PhD, Laura H. Carnell Professor and Chair of the Department of Neuroscience, Director of the Center for Neurovirology, and Director of the Comprehensive NeuroAIDS Center at the Lewis Katz School of Medicine at Temple University (LKSOM). Dr. Khalili and Howard Gendelman, MD, Margaret R. Larson Professor of Infectious Diseases and Internal Medicine, Chair of the Department of Pharmacology and Experimental Neuroscience and Director of the Center for Neurodegenerative Diseases at UNMC, were senior investigators on the new study.

"This achievement could not have been possible without an extraordinary team effort that included virologists, immunologists, molecular biologists, pharmacologists, and pharmaceutical experts," Dr. Gendelman said. "Only by pooling our resources together were we able to make this groundbreaking discovery."

Current HIV treatment focuses on the use of antiretroviral therapy (ART). ART suppresses HIV replication but does not eliminate the virus from the body. Therefore, ART is not a cure for HIV, and it requires life-long use. If it is stopped, HIV rebounds, renewing replication and fueling the development of AIDS. HIV rebound is directly attributed to the ability of the virus to integrate its DNA sequence into the genomes of cells of the immune system, where it lies dormant and beyond the reach of antiretroviral drugs.

In previous work, Dr. Khalili's team used CRISPR-Cas9 technology to develop a novel gene editing and gene therapy delivery system aimed at removing HIV DNA from genomes harboring the virus. In rats and mice, they showed that the gene editing system could effectively excise large fragments of HIV DNA from infected cells, significantly impacting viral gene expression. Similar to ART, however, gene editing cannot completely eliminate HIV on its own.

For the new study, Dr. Khalili and colleagues combined their gene editing system with a recently developed therapeutic strategy known as long-acting slow-effective release (LASER) ART. LASER ART was co-developed by Dr. Gendelman and Benson Edagwa, PhD, Assistant Professor of Pharmacology at UNMC.

LASER ART targets viral sanctuaries and maintains HIV replication at low levels for extended periods of time, reducing the frequency of ART administration. The long-lasting medications were made possible by pharmacological changes in the chemical structure of the antiretroviral drugs. The modified drugs were packaged into nanocrystals, which readily distribute to tissues where HIV is likely to be lying dormant. From there, the nanocrystals, stored within cells for weeks, slowly release the drugs.

According to Dr. Khalili, "We wanted to see whether LASER ART could suppress HIV replication long enough for CRISPR-Cas9 to completely rid cells of viral DNA."

To test their idea, the researchers used mice engineered to produce human T cells susceptible to HIV infection, permitting long-term viral infection and ART-induced latency. Once infection was established, mice were treated with LASER ART and subsequently with CRISPR-Cas9. At the end of the treatment period, mice were examined for viral load. Analyses revealed complete elimination of HIV DNA in about one-third of HIV-infected mice.

"The big message of this work is that it takes both CRISPR-Cas9 and virus suppression through a method such as LASER ART, administered together, to produce a cure for HIV infection," Dr. Khalili said. "We now have a clear path to move ahead to trials in non-human primates and possibly clinical trials in human patients within the year."

Credit: 
Temple University Health System

Researchers cast neural nets to simulate molecular motion

image: New deep learning models predict the interactions between atoms in organic molecules. These models will help computational biologists and drug development researchers understand and treat disease.

Image: 
Los Alamos National Laboratory

LOS ALAMOS, N.M., July 2, 2019--New work from Los Alamos National Laboratory, the University of North Carolina at Chapel Hill, and the University of Florida is showing that artificial neural nets can be trained to encode quantum mechanical laws to describe the motions of molecules, supercharging simulations potentially across a broad range of fields.

"This means we can now model materials and molecular dynamics billions of times faster compared to conventional quantum methods, while retaining the same level of accuracy," said Justin Smith, Los Alamos physicist and Metropolis Fellow in the laboratory's Theoretical Division. Understanding how molecules move is critical to tapping their potential value for drug development, protein simulations and reactive chemistry, for example, and both quantum mechanics and experimental (empirical) methods feed into the simulations.

The new technique, called the ANI-1ccx potential, promises to advance the capabilities of researchers in many fields and improve the accuracy of machine learning-based potentials in future studies of metal alloys and detonation physics.

Quantum mechanical (QM) algorithms, used on classical computers, can accurately describe the mechanical motions of a compound in its operational environment. But QM scales very poorly with varying molecular sizes, severely limiting the scope of possible simulations. Even a slight increase in molecular size within a simulation can dramatically increase the computational burden. So practitioners often resort to using empirical information, which describes the motion of atoms in terms of classical physics and Newton's Laws, enabling simulations that scale to billions of atoms or millions of chemical compounds.

Traditionally, empirical potentials have had to strike a balance between accuracy and transferability: when the many parameters of a potential are finely tuned for one compound, its accuracy decreases on other compounds.

Instead, the Los Alamos team, with the University of North Carolina at Chapel Hill and the University of Florida, has developed a machine learning approach based on transfer learning that lets them build empirical potentials by learning from data collected about millions of other compounds. The new machine-learned empirical potential can be applied to new molecules in milliseconds, enabling research into a far greater number of compounds over much longer timescales.
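As a rough illustration of how such machine-learned potentials are structured (a schematic sketch in the spirit of atomic neural-network potentials, not the actual ANI-1ccx code; the descriptor, weights, and all names here are invented for the example), the total energy is written as a sum of per-atom energies, each predicted by a small network from a descriptor of that atom's local environment:

```python
# Schematic sketch of a neural-network interatomic potential
# (illustrative only -- not the ANI-1ccx implementation).
import numpy as np

rng = np.random.default_rng(42)

def descriptors(positions, cutoff=4.0):
    """Toy per-atom descriptor: radially binned neighbor counts within a cutoff.
    Real models use symmetry functions encoding the local chemical environment."""
    n = len(positions)
    bins = np.linspace(0.0, cutoff, 9)              # 8 radial bins
    feats = np.zeros((n, 8))
    for i in range(n):
        d = np.linalg.norm(positions - positions[i], axis=1)
        d = d[(d > 1e-9) & (d < cutoff)]            # neighbors only, no self
        feats[i], _ = np.histogram(d, bins=bins)
    return feats

# A tiny, randomly initialized network standing in for a trained atomic model.
W1, b1 = rng.normal(size=(8, 16)) * 0.1, np.zeros(16)
W2, b2 = rng.normal(size=(16, 1)) * 0.1, np.zeros(1)

def total_energy(positions):
    """Total energy as a sum of per-atom contributions (arbitrary units)."""
    x = descriptors(positions)
    h = np.tanh(x @ W1 + b1)          # per-atom hidden features
    e_atom = (h @ W2 + b2).ravel()    # per-atom energy contributions
    return e_atom.sum()

atoms = rng.uniform(0.0, 10.0, size=(50, 3))   # 50 atoms in a 10-angstrom box
print(total_energy(atoms))
```

Because the energy is a sum of local terms, evaluating such a potential scales roughly linearly with the number of atoms, which is what allows simulations to reach system sizes and timescales far beyond the reach of quantum mechanical methods.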

Credit: 
DOE/Los Alamos National Laboratory

Study suggests genetic testing for young people diagnosed with type 1 diabetes

BOSTON -- (July 2, 2019) -- A Joslin Diabetes Center study among people treated for type 1 diabetes for many years has discovered that a minority may have monogenic diabetes, a non-autoimmune inherited condition that in some cases does not require insulin treatment.

"Our finding has clinical implications," says George L. King, MD, Joslin Senior Vice President and Chief Scientific Officer, and senior author on a paper describing the work published in the Journal of Clinical Investigation. "We are recommending that everyone under 18 who is diagnosed with type 1 diabetes be screened for monogenic diabetes, which is not being done at this time."

This result is part of an ongoing research initiative among Joslin Medalists, who have lived with type 1 diabetes or insulin-dependent diabetes for at least 50 years. The Joslin team also reported other significant discoveries about the activity of insulin-producing pancreatic beta cells over time in this population.

As the name suggests, monogenic diabetes is produced by a mutation in at least one gene that affects insulin secretion, explains Marc Gregory Yu, MD, first author on the paper. The condition makes up between 1% and 5% of diabetes cases, many in a form known as maturity-onset diabetes of the young (MODY).

Yu worked with co-senior author Marcus Pezzolesi, PhD, and other Joslin colleagues to test for 29 genes implicated in monogenic diabetes, plus other genes known to help drive autoimmune type 1 diabetes.

Among 1,019 Medalists tested, about 8% had a monogenic diabetes mutation that might drive disease. Within that group, slightly less than half did not exhibit the genetic variations needed to trigger type 1 diabetes--which suggested that they might respond well to oral drugs rather than only to insulin. In the remainder of the group, who displayed both types of genetic alterations, "we don't really know which genetic condition is causing their diabetes," King says.

Joslin investigators expect to launch a clinical trial within months to see if oral diabetes drugs can help Medalists with mutated monogenic diabetes genes manage their disease more effectively. "This will be the first clinical study looking at the administration of oral drugs in an older population with monogenic diabetes," Yu says. If the trial results are positive, they may suggest changes in care for tens of thousands of people among the million-plus individuals in the United States who have been diagnosed with type 1 diabetes.

In addition to their genetic analysis, the Joslin team made discoveries about the presence and behavior of beta cells both in living Medalists and in pancreases donated by many Medalists after death.

For decades, scientists believed that all beta cells eventually are destroyed in type 1 diabetes. However, earlier studies of Medalists and other sets of longtime survivors had shown that small numbers of the cells persist in all individuals with type 1 diabetes. In the current study, Susan Bonner-Weir, PhD, a senior investigator in Joslin's Section on Islet Transplantation and Cell Biology, and co-workers confirmed this finding of beta cells in all of the 68 donated pancreases that were examined.

Additionally, the Joslin team reported discoveries from experiments among living Medalists in which these volunteers were given infusions that could stimulate their insulin production. Bonner-Weir showed that the results of these infusion tests matched up well with later analyses of beta cells in postmortem analysis of donated pancreases.

Moreover, examining how Medalists responded to insulin-stimulation testing over time produced one unexpected result. Typically, the low levels of insulin production in this population drop with age. However, a few Medalists actually increased their signs of insulin production when they repeated one stimulation test several years later. "These beta cell functions can come and go, which clearly shows that clinical trials studying ways to regenerate beta cells need to have a control group of participants," King says.

Credit: 
Joslin Diabetes Center

Getting more heat out of sunlight

image: The new aerogel insulating material is highly transparent, transmitting 95% of light. In this photo, parallel laser beams are used to make the material visible.

Image: 
Lin Zhao

CAMBRIDGE, MA -- A newly developed material that is so transparent you can barely see it could unlock many new uses for solar heat. It generates much higher temperatures than conventional solar collectors do -- enough to be used for home heating or for industrial processes that require heat of more than 200 degrees Celsius (392 degrees Fahrenheit).

The key to the process is a new kind of aerogel, a lightweight material that consists mostly of air, with a structure made of silica (which is also used to make glass). The material lets sunlight pass through easily but blocks solar heat from escaping. The findings are described in the journal ACS Nano, in a paper by Lin Zhao, an MIT graduate student; Evelyn Wang, professor and head of the Department of Mechanical Engineering; Gang Chen, the Carl Richard Soderberg Professor in Power Engineering; and five others.

The key to efficient collection of solar heat, Wang explains, is being able to keep something hot internally while remaining cold on the outside. One way of doing that is using a vacuum between a layer of glass and a dark, solar-absorbing material. This setup is used in many concentrating solar collectors, but it's relatively expensive to install and maintain. There has been great interest in finding a less expensive, passive system for collecting solar heat at the higher temperature levels needed for space heating, food processing, or many industrial processes.

Aerogels, a kind of foam-like material made of silica particles, have been developed for years as highly efficient and lightweight thermal insulating materials, but they have generally had limited transparency to visible light, with around a 70 percent transmission level. Wang says making aerogels that are transparent enough to work for solar heat collection was a long and difficult process involving several researchers for about four years. But the result is an aerogel that lets through over 95 percent of incoming sunlight while maintaining the material's highly insulating properties.

The key is in the precise ratios of the different chemicals used to create the aerogel, which is made by mixing a catalyst with a silicon-containing compound in a liquid solution, forming a kind of wet gel, and then drying it to get all the liquid out, leaving a matrix that is mostly air but retains the original mixture's structure. Producing a mix that chemically reacts much more quickly than those in conventional aerogels, the team found, resulted in a gel with smaller pore spaces between its grains, which therefore scattered the light much less.
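The optics behind that last point is standard Rayleigh scattering (our gloss, not the paper's analysis): for pores much smaller than the wavelength $\lambda$ of light, the scattering cross-section of a pore of diameter $d$ scales as

$$ \sigma_{s} \propto \frac{d^{6}}{\lambda^{4}}, $$

so even a modest reduction in pore size suppresses visible-light scattering dramatically, which is why the faster-reacting mix yields a much clearer aerogel.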

In tests on a rooftop on the MIT campus, a passive device consisting of a solar-absorbing dark material covered with a layer of the new aerogel was able to reach and maintain a temperature of 220 degrees Celsius, in the middle of a Cambridge winter when the outside air was below 0 degrees Celsius.

Such high temperatures have previously been practical only with concentrating systems, which use mirrors to focus sunlight onto a central line or point, but this system requires no concentration, making it simpler and less costly. That could potentially make it useful for a wide variety of applications that require higher levels of heat.

For example, simple flat rooftop collectors are often used for domestic hot water, producing temperatures of around 80 C. But the higher temperatures enabled by the aerogel system could make such simple systems usable for home heating as well. Large-scale versions could be used to provide heat for a wide variety of applications in chemical, food production, and manufacturing processes.

Zhao describes the basic function of the aerogel layer as "like a greenhouse effect. The material we use to increase the temperature acts like the Earth's atmosphere does to provide insulation, but this is an extreme example of it."

For most purposes, the passive heat collection system would be connected to pipes containing a liquid that could circulate to transfer the heat to wherever it's needed. Alternatively, Wang suggests, for some uses the system could be connected to heat pipes, devices that can transfer heat over a distance without requiring pumps or any moving parts.

Because the principle is essentially the same, an aerogel-based solar heat collector could directly replace the vacuum-based collectors used in some existing applications, providing a lower-cost option. The materials used to make the aerogel are all abundant and inexpensive; the only costly part of the process is the drying, which requires a specialized device called a critical point dryer to allow for a very precise drying process that extracts the solvents from the gel while preserving its nanoscale structure.

Because that is a batch process rather than a continuous one that could be used in roll-to-roll manufacturing, it could limit the rate of production if the system is scaled up to industrial production levels. "The key to scaleup is how we can reduce the cost of that process," Wang says. But even now, a preliminary economic analysis shows that the system can be economically viable for some uses, especially in comparison with vacuum-based systems.

Credit: 
Massachusetts Institute of Technology

Astronomers help wage war on cancer

image: A model showing light (red/yellow) penetrating the surface of the human breast (white triangles).

Image: 
Tim Harries

Techniques developed by astronomers could help in the fight against breast and skin cancer. Charlie Jeynes of the University of Exeter will present the work he carried out with Prof Tim Harries' team today (3 July) at the RAS National Astronomy Meeting (NAM 2019) at the University of Lancaster.

A large part of astronomy depends on the detection and analysis of light. For example, scientists study the light scattered, absorbed and re-emitted in clouds of gas and dust, obtaining information on their interior.

Despite the vast differences in scale, the processes that light undergoes when travelling through the human body are very similar to those seen in space. And when things go wrong - when tissue becomes cancerous - that change should show up.

In the UK, nearly 60,000 women are diagnosed with breast cancer each year, and 12,000 die. Early diagnosis is key, with 90% of women diagnosed at the earliest stage surviving for at least five years, compared to 15% for women diagnosed with the most advanced stage.

Cancer creates tiny deposits of calcium in breast tissue, which can be detected through a shift in the wavelength of light as it passes through the tissue. The Exeter team realised that the computer codes developed to study the formation of stars and planets could be applied to find these deposits.

Charlie commented: "Light is fundamental to a diverse range of medical advances, like measuring blood oxygenation in premature babies, or treating port-wine stains with lasers. So there is a natural connection with astronomy, and we're delighted to use our work to take on cancer."

Working with biomedical scientist Nick Stone, also at Exeter, the team are refining computer models to better understand how detected light is affected by human tissue. They eventually expect to develop a rapid diagnostic test that avoids unnecessary biopsies, improving the prospects for survival for thousands of women. Work is already underway with clinicians at Exeter's RD&E hospital to pilot the technology and pave the way for larger clinical trials.

In a second project, the Exeter team are using computer models for a potential new treatment for non-melanoma skin cancer (NMSC). This is the most common type of cancer, with more than 80,000 cases reported in England each year. NMSC is expected to cost the NHS £180 million a year by 2020, a figure set to rise as the disease becomes more prevalent.

In a partnership with Alison Curnow of the University of Exeter Medical School, the scientists are using their code to develop a simulated 'virtual laboratory' for studying skin cancer treatment. The two-pronged attack looks at light-activated drugs (photodynamic therapy) and light-heated nanoparticles (photothermal therapy).

The simulation looks at how gold nanoparticles in a virtual skin tumour are heated by exposure to near-infrared light. After 1 second of irradiation, the tumour heats up by 3 degrees Celsius. After 10 minutes, the same tumour is heated by 20 degrees - enough to kill its cells. So far, photothermal therapy with nanoparticles has been effective in rats, but with the team's code to narrow down experimental conditions, they are working towards translating the technology for humans.

Charlie said: "Advances in fundamental science should never be seen in isolation. Astronomy is no exception, and though impossible to predict at the outset, its discoveries and techniques often benefit society. Our work is a great example of that, and I'm really proud that we're helping our medical colleagues wage war on cancer."

The next steps include using 3D-rendered models taken from images of real tumours, and simulating how these would respond to different treatment regimes. Data exists on how these tumours responded to treatment, which gives excellent 'ground truth' data against which to compare the models. In this way the team will be able to predict whether different types of treatment would be more effective for a particular tumour type, and enable clinicians to have more options when it comes to choosing a treatment plan.

Credit: 
Royal Astronomical Society

Maize-centric diet may have contributed to ancient Maya collapse

The question of how to best adapt to extreme climate is a critical issue facing modern societies worldwide. In "The Role of Diet in Resilience and Vulnerability to Climate Change among Early Agricultural Communities in the Maya Lowlands," published in Current Anthropology, authors Claire Ebert, Julie Hoggarth, Jaime Awe, Brendan Culleton, and Douglas Kennett examine the role of diet in the ability of the ancient Maya to withstand periods of severe climatic stress. The authors found that an increase in the elite Maya's preference for a maize-based diet may have made the population more vulnerable to drought, contributing to its societal collapse.

"Population expansion and anthropogenic environment degradation from agricultural intensification, coupled with socially conditioned food preferences, resulted in a less flexible and less resilient system," Ebert writes. "Understanding the factors promoting resilience in the past can help mitigate the potential for similar sudden and dramatic shifts in our increasingly interconnected modern world."

The study was conducted using the remains of 50 human burials from the ancient Maya community of Cahal Pech, Belize. Using AMS radiocarbon dating, Ebert and collaborators determined the age of the human burials found at Cahal Pech, both from the site core and surrounding settlements. These burials dated as early as the Middle Preclassic Period, between 735-400 B.C., and as late as the Terminal Classic, between approximately 800-850 A.D.

At the Human Paleoecology and Isotope Geochemistry Laboratory at Penn State University, Ebert measured stable carbon and nitrogen isotope values of the bone collagen in the burials to determine characteristics of individual diets and how they changed through time. Of particular interest was the increasing proportion of C4 plants in the diet, which includes the Maya staple crop maize.

For the burials dating to the Preclassic and Early Classic periods, representing the early inhabitants of Cahal Pech, Ebert's results suggest that both elites and commoners had a diverse diet that, in addition to maize, included wild plants and animals procured through hunting. Ebert suggests that this diversity of food provided a buffer when a multi-century drought impacted the Maya lowlands between 300-100 B.C. "The resilience of complex social systems at Cahal Pech from the Preclassic through Early Classic was dependent in part upon a broad subsistence strategy that helped to absorb shocks to maize-based food production in the context of drought," Ebert writes.

Things took a turn during the Terminal Classic period, between 750 and 900 A.D., when growing social hierarchies and population expansion led to the intensification of agricultural production and an increasing reliance on maize. During this time frame, Ebert found that humans from surrounding settlements at Cahal Pech had different carbon values than those at the site's center, where the elite class lived. "Our results show a pattern of highly restricted stable nitrogen and carbon isotopes for elite individuals in the Late and Terminal Classic, which corresponds to a hyper-specialized maize-based diet that persisted through the final abandonment of the site," Ebert writes. Elite demands on the local population for increased maize production, and a preference for this drought-intolerant crop, were likely factors that contributed to the failure of the Cahal Pech socio-political system in the face of another severe drought at the end of the Terminal Classic Period.

"The study speaks to the importance of diet in the resilience and decline of ancient societies and contributes to our understanding of vulnerability to climate change among modern traditional farming communities as well as industrialized nations," Ebert writes.

Credit: 
University of Chicago Press Journals

Old-growth forest may provide valuable biodiversity refuge in areas at risk of severe fire

image: A northern spotted owl.

Image: 
USDA Forest Service photo by Damon Lesmeister

New findings show that old-growth forests, a critical nesting habitat for threatened northern spotted owls, are less likely to experience high-severity fire than young-growth forests during wildfires. This suggests that old-growth forest could be leveraged to provide valuable fire refuges that support forest biodiversity and buffer the extreme effects of climate change on fire regimes in the Pacific Northwest.

A recent study published in the journal Ecosphere examined the impact of the Douglas Complex and Big Windy fires that burned in the Klamath-Siskiyou region of Oregon during July 2013, a drought year. The fires burned through a long-term study area for northern spotted owls. Using information on forest vegetation before and after the fires, along with known spotted owl nesting areas, researchers had an unprecedented chance to compare the impact of wildfire on critical old-growth nesting habitat.

"On federally managed lands, spotted owl nesting habitat is largely protected from timber harvest under the Northwest Forest Plan, but wildfire is still a primary threat to the old-growth forest that spotted owls rely on for nesting habitat," said research wildlife biologist Damon Lesmeister. "The loss of spotted owl nesting habitat as a result of severe fire damage could have significant negative impacts on the remaining spotted owl populations as well as a large number of other wildlife species that rely on these old forests."

Old-growth forests have more vegetation than younger forests. Researchers expected that this meant more fuel would be available for wildfires, increasing the susceptibility of old-growth forests to severe fire, high tree mortality, and resulting loss of critical spotted owl nesting habitat. However, the data suggested a different effect.

Lesmeister and his colleagues classified fire severity based on the percentage of trees lost in a fire, counting forest that lost less than 20% of its trees as having burned at low severity and forest that lost more than 90% of its trees as having burned at high severity. They found that old-growth forest was up to three times more likely to burn at low severity--a level that avoided loss of spotted owl nesting habitat and is generally considered to be part of a healthy forest ecosystem.
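Expressed as code, that classification amounts to a simple thresholding rule (a sketch using the thresholds quoted above; the 'moderate' label for the in-between range is our own shorthand, and the study's exact class boundaries may differ):

```python
def fire_severity(fraction_trees_lost):
    """Classify burn severity from the fraction of trees lost in a fire,
    using the <20% and >90% thresholds described in the study."""
    if fraction_trees_lost < 0.20:
        return "low severity"
    if fraction_trees_lost > 0.90:
        return "high severity"
    return "moderate severity"   # our shorthand for the in-between range

print(fire_severity(0.15))   # -> low severity
print(fire_severity(0.95))   # -> high severity
```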

"Somewhat to our surprise, we found that, compared to other forest types within the burned area, old-growth forests burned on average much cooler than younger forests, which were more likely to experience high-severity fire. How this actually plays out during a mixed-severity wildfire makes sense when you consider the qualities of old-growth forest that can limit severe wildfire ignitions and burn temperatures, like shading from multilayer canopies, cooler temperatures, moist air and soil as well as larger, hardier trees."

Because old-growth forests may be refuges of low-severity fire on a landscape that experiences moderate to high-severity fires frequently, they could be integral as biodiversity refuges in an increasingly fire-prone region. Leveraging the potential of old-growth forests to act as refuges may be an effective tool for forest managers as they deal with worsening fire seasons in the Pacific Northwest.

Credit: 
USDA Forest Service - Pacific Northwest Research Station

Graphenes now go monolayer and single crystalline

image: Figure 1: A large scale (~2.5 mm x 1.6 mm) SEM image of an adlayer-free single crystal graphene film on a Cu(111) foil.

Image: 
IBS

Director Rodney Ruoff's research group from the Center for Multidimensional Carbon Materials (CMCM) within the Institute for Basic Science (IBS) at the Ulsan National Institute of Science and Technology (UNIST) has reported a truly single layer (i.e., adlayer-free) large area graphene film on large area copper foils. This might seem like the latest in a series of seemingly similar declarations on single layer graphene. However, this achievement differs from thousands of previous publications in that none of them described truly single layer graphene over a large area; adlayers (bilayer or multilayer regions) have always been present in such films.

IBS scientists refined the chemical vapor deposition (CVD) growth method by eliminating all carbon impurities inside the copper foils on which graphene is grown. CVD on metal foils (especially copper foils) is currently the most promising route for scalable and reproducible synthesis of large-area graphene films of high quality. The team investigated why "adlayers" appeared in the graphene film grown on copper foils and found that the carbon impurities inside the foil directly lead to the nucleation and growth of adlayers. (Adlayers are regions in the film where there are 2 layers or 3 layers present--multilayer "patches", that is.)

"We discovered that the commercial copper foils have 'excess carbon' particularly near the surface--to a depth of about 300 nm, by using time-of-flight secondary ion mass spectrometry and combustion analysis. From a discussion with one technical expert in Jiangxi Copper Corporation Limited, one of the world's largest suppliers of copper foils, we learned that the carbon is embedded in the copper foil during manufacture, probably from hydrocarbon-based oil(s) used to lubricate rollers that the copper foil contacts at the high rolling temperatures", said Dr. Da Luo, first author of the article. After completely removing this carbon by annealing under H2 at 1060 °C, they were able to achieve adlayer-free and thus truly single layer graphene film.

By applying the same method, the IBS scientists also obtained an adlayer-free single layer and single crystal graphene film on a single crystal Cu foil. One of the first authors, Meihui Wang, explained, "We thus solved two problems that have persistently been present in prior syntheses of CVD graphene films (adlayers and grain boundaries (GBs)) at one time." Indeed, achieving perfect uniformity in the number of layers over a large area (single or double layers, for example) can be used to ensure a consistent device performance. Adlayer regions differ in, e.g., density and size when present in the active regions of devices. In addition to adlayers, GBs are present in polycrystalline graphene films prepared by CVD, where graphene islands with different crystallographic orientations join to complete the film. The presence of GBs lowers carrier mobility and thermal conductivity, and reduces the mechanical strength.

Still, the scientists were left with one fascinating feature in their single crystal films: This single crystal graphene contains highly oriented parallel "folds" that are centimeters in length, roughly a hundred nanometers in width, and separated by 20 to 50 micrometers. Just like adlayers and GBs, folds were observed to significantly decrease the carrier mobility of graphene. To eliminate such scattering effects of adlayers, GBs and folds, the team patterned field-effect transistors in the region lying between two adjacent folds and with the transistors parallel to the folds. Unlike folds distributed quasi-randomly in the polycrystalline graphene film, the folds are highly aligned in the large area single crystal graphene film. This makes it easy to fabricate integrated high performance devices from the regions between the folds. Wang explained, "The region between two adjacent folds is 'clean' without any folds, adlayers, or GBs. This enabled the device to have very high electron and hole mobility. The field-effect transistors show very high room temperature carrier mobility values of around 1.0 × 10⁴ cm² V⁻¹ s⁻¹. Such high carrier mobility 'translates' to various useful devices having high performance."

Director Ruoff noted, "Our approach towards large area adlayer-free single crystal graphene is a breakthrough. This uniform, 'perfect' single layer, single crystal graphene is expected to find use as an ultrathin support material for high-resolution transmission electron microscopy imaging, and in optical devices. Also as an appropriate graphene to achieve extremely uniform functionalization which leads to many other applications, particularly for sensors of various types. I also would like to note that we greatly valued the strong contributions by coauthors from UNIST, from HKUST, and from SKKU." This research was supported by the Institute for Basic Science, and has been published in the journal Advanced Materials.

Credit: 
Institute for Basic Science

Study finds dramatic differences in tests assessing preschoolers' language skills

image: Caitlin M. Imgrund, Ph.D., senior author and an assistant professor in FAU's Department of Communication Sciences and Disorders in the College of Education.

Image: 
Caitlin M. Imgrund, Ph.D.

About 1 in 10 babies in the United States is born preterm. These children are at increased risk for adverse outcomes across a broad spectrum of neurodevelopmental domains, including language skills, as well as for attention-deficit/hyperactivity disorder (ADHD) and other behavioral problems.

Preschool is a crucial time for language development. Children born preterm who display deficits in language skills are unlikely to catch up with their full-term peers. That's why it's imperative to accurately assess their language skills to determine if they need early intervention.

One common method of evaluating language skills is standardized assessments, or tests. Another is language sample analysis, which provides a great deal of information about a child's language abilities and overall conversational skills. Despite its diagnostic utility, few studies have examined language sample analysis in conjunction with standardized assessment outcomes in children born preterm.
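In practice, language sample analysis derives quantitative measures directly from transcripts of a child's spontaneous speech. As a purely illustrative sketch (the specific measures used in this study are not named in this release), the snippet below computes mean length of utterance in words, one classic language-sample metric; the function and sample data are hypothetical.

```python
# Illustrative only: mean length of utterance (MLU) in words, a classic
# language-sample metric. Not necessarily one of the measures in this study.

def mean_length_of_utterance(utterances):
    """Average number of words per transcribed utterance."""
    counts = [len(u.split()) for u in utterances if u.strip()]
    return sum(counts) / len(counts) if counts else 0.0

sample = ["doggie run", "I want the big ball", "more juice please"]
print(round(mean_length_of_utterance(sample), 2))  # (2 + 5 + 3) / 3 -> 3.33
```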

To bridge this gap, a researcher at Florida Atlantic University's College of Education and collaborators investigated the impact of preterm birth on language outcomes in preschoolers born preterm and full term, using both standardized assessment and language sample analysis.

In addition to measures of language that included semantic skills as well as grammatical ability, the researchers examined nonlinguistic developmental skills of nonverbal intelligence, attention, and hyperactivity.

Results of the study, published in the Journal of Speech, Language, and Hearing Research, show that, in general, the children born preterm performed more poorly when language skill was measured via language sample analysis than via standardized assessment. There were statistically significant group differences on all language measures obtained from the language sample analyses. In contrast, the standardized assessments revealed a group difference on only one measure of language skill.

The researchers did not find any differences between the two groups on nonlinguistic factors such as ADHD symptoms and nonverbal intelligence in either the standardized assessments or the language sample analyses. In fact, none of the nonlinguistic skills accounted for a significant amount of the observed group differences on the language sample variables.

On the grammatical measures from the language sample analyses, the researchers found statistically significant differences between the two groups, with the preterm children exhibiting substantial grammatical difficulties. Unexpectedly, these children showed deficits in discourse-level semantic and grammatical skills that were not evident from the standardized assessments.

"Language difficulties at the discourse level may still exist even when children who are born preterm appear to be developing typically when they are evaluated by standardized assessments of global language ability, cognition, and attention," said Caitlin M. Imgrund, Ph.D., senior author and an assistant professor in FAU's Department of Communication Sciences and Disorders. "We also could not attribute these group differences to nonlinguistic factors such as inattention, hyperactivity, or nonverbal intelligence."

Findings from the study have important clinical implications for practitioners who work with preterm children. Deficits in conversational skills may be difficult to assess through the traditional use of standardized assessments, which underscores the importance of using both language sample analysis and standardized assessment to measure children's expressive language skills.

Credit: 
Florida Atlantic University

Newly-discovered 1,600-year-old mosaic sheds light on ancient Judaism

image: Elim mosaic detail, Huqoq Excavation Project. Copyright Jim Haberman. All rights reserved. Courtesy: UNC-Chapel Hill.

Image: 
Copyright Jim Haberman

For nine years running, Carolina professor Jodi Magness has led a team of research specialists and students to the ancient village of Huqoq in Israel's Lower Galilee, where they bring to light the remains of a Late Roman synagogue. For weeks during the summer, they unearth history in the form of art. With each excavation season, the students and researchers build on what little is known about the fifth century CE Jewish community of Huqoq and the artists who crafted depictions of biblical stories with tiny cubes of stone, or tesserae.

Jodi Magness, director of the Huqoq excavations and Kenan Distinguished Professor for Teaching Excellence in Early Judaism in the Department of Religious Studies in Carolina's College of Arts & Sciences, explains her team's newest findings and how the art they find connects them to texts written thousands of years ago.

Question: If you could name the biggest new discovery of this summer, what would it be?

Answer: I couldn't name just one from this summer's work, so how about two big discoveries?

First: Chapter 7 in the book of Daniel describes four beasts which represent the four kingdoms leading up to the end of days. This year our team discovered mosaics in the synagogue's north aisle depicting these four beasts, as indicated by a fragmentary Aramaic inscription referring to the first beast: a lion with eagle's wings. The lion itself is not preserved, nor is the third beast. However, the second beast from Daniel 7:4 - a bear with three ribs protruding from its mouth - is preserved. So is most of the fourth beast, which is described in Daniel 7:7 as having iron teeth.

Second: We've uncovered the first depiction of the episode of Elim ever found in ancient Jewish art. This story is from Exodus 15:27. Elim is where the Israelites camped after leaving Egypt and wandering in the wilderness without water. The mosaic is divided into three horizontal strips, or registers. We see clusters of dates being harvested by male agricultural workers wearing loincloths, who are sliding the dates down ropes held by other men. The middle register shows a row of wells alternating with date palms. On the left side of the panel, a man in a short tunic is carrying a water jar and entering the arched gate of a city flanked by crenellated towers. An inscription above the gate reads, "And they came to Elim."

Q: A lot of previous discoveries give so much context for this period. What questions do this year's findings prompt for you?

A: The Daniel panel is interesting because it points to eschatological, or end-of-days, expectations among this congregation. The Elim panel is interesting because Elim is generally considered a fairly minor episode in the Israelites' desert wanderings, which raises the question of why it was significant to this Jewish congregation in Lower Galilee.

Q: Can you describe the "Wow, look at this!" moment of this year's dig?

A: The "Wow!" moment came when we understood that the animals depicted in the mosaic in the north aisle are the four beasts in Daniel 7. And that was something we realized only a week after uncovering them, when one of our staff members was able to read the accompanying Aramaic inscription identifying the first beast.

Q: Each year, you and the team uncover pieces of history that are significant to so many people for a variety of reasons. What do you hope this work does for the field and what we know of history?

A: Our work sheds light on a period when our only written sources about Judaism are rabbinic literature from the Jewish sages of this period and references in early Christian literature. The full scope of rabbinic literature is huge and diverse, but it represents the viewpoint of the group of men who wrote it. That group was fairly elite, and we don't have the writings of other groups of Jews from this period. Early Christian literature is generally hostile to Jews and Judaism. So, archaeology fills this gap by shedding light on aspects of Judaism from the fourth to the sixth centuries CE, about which we would know nothing otherwise. Our discoveries indicate Judaism continued to be diverse and dynamic long after the destruction of the second Jerusalem temple in 70 CE.

Q: Now in the ninth season of digging at this site, what keeps you and the team coming back?

A: We are committed to completing the excavation of the synagogue before we turn the site over to the state of Israel, with the hope that they will develop and open it to the public in the future. In the meantime, I expect our work will continue to shed light on the past through new discoveries.

The mosaics have been removed from the site for conservation, and the excavated areas have been backfilled. Excavations are scheduled to continue in summer 2020. For additional information, images of previous discoveries, and project updates, visit the Huqoq Excavation Project website.

Credit: 
University of North Carolina at Chapel Hill