
Study links financial hardship to more ED visits; less preventive care

A new American Cancer Society study finds that higher medical and nonmedical financial hardship is independently associated with more emergency department visits, lower receipt of some preventive services, and worse self-rated health in cancer survivors. The study's authors say that as healthcare costs grow, unmet medical and nonmedical financial needs may worsen health disparities among cancer survivors. The study appears in the American Journal of Preventive Medicine.

More than half of Americans experience medical financial hardship at some point in their lives. Cancer survivors are particularly vulnerable: they are more likely than individuals without a cancer history to face material (e.g., problems paying medical bills), psychological (e.g., worry about medical costs), and behavioral (e.g., delaying or forgoing care because of cost) financial hardships. Cancer survivors also face nonmedical financial hardship, including food insecurity and worry about other economic needs (e.g., monthly bills and housing expenses), likely because the late and lasting effects of cancer treatment, along with work limitations or an inability to work, reduce earnings and can lead to the loss of employer-sponsored health insurance coverage. Yet there has been little research evaluating whether these medical and nonmedical financial hardships affect survivors' use of preventive services.

To learn more, investigators led by Zhiyuan "Jason" Zheng, Ph.D., looked at responses from about 12,000 cancer survivors in the National Health Interview Survey (2013-2017), stratifying survivors into two age groups (18-64 years and ≥65 years). They found cancer survivors with higher medical and nonmedical financial hardship intensities were consistently more likely to report any emergency room visit and rated their health status worse than those with lower hardship intensities. Those with the highest level of hardship intensity also had lower rates of influenza vaccination (ages 18-64: 45.6% vs. 52.5%; ages ≥65: 64.6% vs. 75.6%) and breast cancer screening (46.8% vs. 61.2%).

"Given greater patient cost sharing and rapid development of expensive cancer treatments, the experience of medical and nonmedical financial hardship is likely to increase and may exacerbate cancer-related health disparities," write the authors.

Credit: 
American Cancer Society

Research news tip sheet: Story ideas from Johns Hopkins Medicine

image: Weekly research news from Johns Hopkins Medicine.

Image: 
Johns Hopkins Medicine

Medical Cannabis Consumers Use Less Healthcare Resources and Report Better Quality Of Life

Media Contact: Vanessa McMains, Ph.D.

Although more than 2 million people are registered in state medical cannabis programs across the United States, very little is known about the medical benefits of cannabis. Most of what is known comes from U.S. Food and Drug Administration-approved cannabis-derived products used to treat a type of pediatric epilepsy, anorexia in patients with AIDS, and nausea and vomiting in patients with cancer.

To determine if medical cannabis users have any perceived or actual health benefits, Johns Hopkins Medicine researchers and colleagues at Realm of Caring Foundation surveyed these users and found that they reported less pain, better sleep and reduced anxiety, along with taking fewer prescription medications.

They also were less likely to have visited an emergency room or have been admitted to a hospital than people who didn't use cannabis for medical purposes.

In their study, published online on June 8, 2020, in Cannabis and Cannabinoid Research, the researchers report that because this early work shows medical benefits for cannabis, more funding and clinical trials are urgently needed to determine what conditions the drug may treat.

"It wasn't surprising to me that people claim to feel better when using medical cannabis, but it was unexpected to see that these people utilized less health care resources," says Ryan Vandrey, Ph.D., associate professor of psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine.

"When we evaluated people before and after using medical cannabis, and then saw the exact same changes seen in the cross-sectional comparison between cannabis users and controls, that's when we knew we had a compelling validation showing actual medical benefit," he adds.

More than 800 medical cannabis users and more than 460 people not using the drug medically were surveyed in this study. From the data, the researchers found that the medical cannabis users reported about an 8% better quality of life on average, along with an approximately 9% reduction in pain scores and a 12% reduction in anxiety scores.

When it came to health care resources, medical cannabis users reported using 14% fewer prescription medications than nonusers, and were 39% less likely to have visited an emergency room and 46% less likely to have been admitted to a hospital in the month before being surveyed.

"This study was a 30,000-foot view of the landscape and now we need to drill down to see what conditions are actually benefitted from medical cannabis use," says Vandrey.

In future studies, his team plans to examine the effects of medical cannabis on epilepsy, anxiety and autism.

Johns Hopkins Team Shows Nervous and Immune Systems 'Need to Talk' For Bone Repair

Media Contact: Michael E. Newman

In a December 2019 study, a team of Johns Hopkins Medicine researchers demonstrated in mice that repair of bone fractures requires the generation, growth and spread of nerve cells, or neurons, throughout the injured area. This, they showed, partly relies on a protein known as nerve growth factor (NGF). Now, the researchers have dug deeper into this process to better understand how the nervous and immune systems work together with NGF to enable nerve regrowth during bone repair.

In a new study, published in the May 26, 2020, issue of the journal Cell Reports, the researchers found once again in mice that two proteins -- tropomyosin receptor kinase-A (TrkA) and NGF -- bind together to stimulate innervation (the supplying of nerves), and subsequently, new bone at an injured site. What surprised them was that the NGF that mattered most in this process came from an unexpected source: macrophages, the white blood cells that alert the immune system to foreign invaders through inflammation, and then engulf and remove the attackers from the body.

"Previous research has shown that immune cells are clearly important in bone repair, but what we determined in our study is that macrophages and their inflammatory signals also kickstart nerve regrowth in injured bone," says Aaron James, M.D., Ph.D., associate professor of pathology at the Johns Hopkins University School of Medicine and co-senior author of both studies.

In other words, James explains, the team's experiments revealed "that NGF-TrkA signaling is how macrophages 'talk' to nerve fibers so that bone healing can begin."

When bones are injured, there is a large release of the NGF neurotrophin (a protein that induces the survival, development and function of neurons). This activates sensory nerves to grow into the injured tissue. These sensory nerves play multiple roles, including alerting the body through pain that the bone is broken and regulating the healing process.

To define the mechanism by which bone is repaired, the researchers removed the same small piece of skull from each of the mice in the study. By manipulating various steps of the NGF-TrkA signaling pathway in different mice, the team found that: (1) the release of NGF coincides with the beginning of innervation, (2) bone injury stimulates the increased production of NGF, (3) inflammation at the injury site drives NGF production by macrophages (which are drawn by chemical signals released during inflammation), (4) increased amounts of NGF elicit new nerve formation in the injured tissue, (5) disrupting the production of NGF reduces innervation and impairs calvarial bone regeneration, and (6) NGF produced by macrophages is the neurotrophin required for bone repair.

"We now understand that nerve growth and bone repair are linked processes," James says. "Knowing this, we may be able to find ways to maximize our innate healing capacities. Developing new methods to improve bone healing would greatly benefit many people, especially the elderly, where injuries such as hip fractures often lead to worse outcomes than heart attacks."

Intervention Reduces Cardiovascular Disease Risk in People with Serious Mental Illness

Media Contact: Vanessa McMains, Ph.D.

According to a new study by Johns Hopkins Medicine researchers, an intervention program for people with serious mental illnesses such as schizophrenia, bipolar disorder and major depression can reduce the 10-year risk of having a heart attack or stroke by an average of nearly 13%. People in this group have twice the rate of cardiovascular disease as the general population and are more likely to die of it because they often have risk factors such as being overweight, getting less physical activity, having high cholesterol or blood pressure, or smoking.

Additionally, these people often face special challenges in receiving heart-healthy interventions because of reduced attention or decision-making skills, as well as socioeconomic risk factors such as poverty, lack of social support and limited access to healthy foods.

The research team's findings were published on June 12, 2020, in JAMA Network Open.

"Our program, which incorporates counseling tied to care management and coordination, showed that it is possible to put together an effective program that can improve cardiovascular health and reduce the health care disparities of people with major mental illnesses," says Gail Daumit, M.D., M.H.S., professor of medicine and psychiatry and behavioral sciences at the Johns Hopkins University School of Medicine.

For this study, 132 participants with serious mental disorders completed an 18-month intervention, while 137 controls did not receive it. Each participant in the intervention program was assigned a counselor and a nurse who helped guide them toward positive lifestyle behaviors such as quitting smoking, eating a healthy diet and managing their cardiovascular health with primary care providers.

To calculate cardiovascular risk both before and after the intervention, the researchers used the Framingham Risk Score for Hard Coronary Heart Disease, which predicts the 10-year risk of having a major cardiovascular event, such as a heart attack or stroke.
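The Framingham calculation itself follows a standard Cox proportional-hazards form: a weighted sum of (mostly log-transformed) risk factors is compared with a population mean and converted to a 10-year event probability via a baseline survival term. The sketch below illustrates that shape in Python; the coefficients, baseline survival and mean term are illustrative placeholders, not the published Framingham values.

```python
import math

# ILLUSTRATIVE coefficients only -- not the published Framingham betas.
ILLUSTRATIVE_BETAS = {
    "log_age": 3.06,
    "log_total_chol": 1.12,
    "log_hdl": -0.93,
    "log_systolic_bp": 1.93,
    "smoker": 0.65,
    "diabetes": 0.57,
}
BASELINE_SURVIVAL = 0.88   # illustrative S0 at 10 years
MEAN_RISK_SUM = 23.98      # illustrative population mean of the linear term

def ten_year_risk(age, total_chol, hdl, systolic_bp, smoker, diabetes):
    """Return a 10-year event probability under the illustrative model."""
    x = {
        "log_age": math.log(age),
        "log_total_chol": math.log(total_chol),
        "log_hdl": math.log(hdl),
        "log_systolic_bp": math.log(systolic_bp),
        "smoker": float(smoker),
        "diabetes": float(diabetes),
    }
    linear = sum(ILLUSTRATIVE_BETAS[k] * v for k, v in x.items())
    # Cox form: risk = 1 - S0 ^ exp(linear predictor - population mean)
    return 1.0 - BASELINE_SURVIVAL ** math.exp(linear - MEAN_RISK_SUM)
```

The intervention's effect would then show up as a drop in this probability when post-intervention risk factor values (e.g., nonsmoking status or lower blood pressure) are fed back into the same function.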

"Although a few previous studies have examined the impact of intervening on a single risk factor, such as smoking or obesity, this investigation took the unique approach of developing and implementing a tailored, multi-level intervention to reduce risk," says Catherine Stoney, Ph.D., deputy director of the Center for Translational Research and Implementation Science at the National Heart, Lung, and Blood Institute, part of the National Institutes of Health. "The findings suggest that incorporating this comprehensive intervention in a high-need, high-risk population can have a positive impact on cardiovascular risk scores."

Daumit says a basic framework for this type of intervention already exists in a Medicaid health home program that uses nurse care managers in community mental health programs. That program is being used in 17 states and the District of Columbia. However, she says, before this study there were no evidence-based interventions addressing cardiovascular risk for those nurses to implement.

"With a structured intervention, we believe that it would be feasible to use the existing framework to reduce cardiovascular risk in persons with serious mental illness nationwide," says Daumit.

Her future work will focus on finding ways to accomplish that goal, including developing and implementing a scaled-up version of the program validated in the study.

What Motivates People To Use Psychedelics? New Criteria May Help Provide the Answer

Media Contact: Vanessa McMains, Ph.D.

To understand why some drugs are likely to be abused, researchers have identified effects on the brain and behavior shared among the most commonly abused drugs. These common effects include feelings of euphoria that "hijack" the system that normally produces pleasure during eating or sex, unpleasant withdrawal when not taking the drug, and drug-seeking behavior in laboratory animals.

As a result, the U.S. Food and Drug Administration established a list of reliably documented effects to predict abuse potential and recommend what legal restrictions should be placed on new medications before they are marketed. While this approach works well for predicting the recreational use of classic drugs of abuse such as opioids, stimulants, and sedatives, it doesn't answer the question of why people use traditional, nonmedical psychedelics such as LSD or psilocybin -- the chemical found in "magic mushrooms."

Now, researchers at Johns Hopkins Medicine have proposed a solution by evaluating a new set of subjective effects that will predict the desire to use psychedelics. The researchers assessed these predictors with two psychedelics: psilocybin and dextromethorphan (DXM), a chemical found in low doses in over-the-counter cough medicine.

The team's findings are reported in the June 5, 2020, issue of the journal Psychopharmacology.

The researchers say the subjective effects they define in their study provide a "descriptive blueprint for understanding motivation for using psychedelics." The set of new features includes psychological insight, increased awareness of beauty, awe/amazement, meaningfulness and mystical-type experiences (e.g., a highly valued sense of unity), positive social effects and visual effects.

"Psilocybin currently is being evaluated as a treatment for various psychiatric conditions and there is increasing interest in developing related psychedelic drugs," says Roland Griffiths, Ph.D., the Oliver Lee McCabe III Professor in the Neuropsychopharmacology of Consciousness at the Johns Hopkins University School of Medicine and director of the Johns Hopkins Center for Psychedelic and Consciousness Research. "Before making such drugs widely available as medicines, it is important to know why people have regularly taken these drugs for hundreds of years, so we can predict factors that might lead to future problematic nonmedical use."

The 20 participants in the trial had previously used psychedelic drugs. On five separate occasions during the study, they took either three different doses of psilocybin, a large dose of DXM or a placebo. Although DXM produced greater effects than the placebo on all of the subjective predictors of future use, psilocybin consistently yielded much larger effects.

"If DXM is a fuzzy black and white TV, psilocybin is high definition color virtual reality," says Griffiths.

Credit: 
Johns Hopkins Medicine

Stroke survival rates worse in rural areas, study says

A major U.S. study reveals large gaps between urban and rural patients in quality of care received after a stroke and rates of survival. In more rural areas, the ability of hospitals to deliver advanced stroke care is lower and mortality rates substantially higher, the research shows.

The analysis, involving nearly 800,000 patients, was led by researchers at Washington University School of Medicine in St. Louis. Their findings are published June 18 in the journal Stroke.

"Our data suggest rural patients are missing out on access to more advanced stroke therapies and that action is needed to address these disparities and ensure that people can get the care they need, no matter where they live," said senior author Karen Joynt Maddox, MD, an assistant professor of medicine. "In this day and age, it's unacceptable that people don't have access to advanced care. But since stroke therapy is complex, solutions are not going to be one-size-fits-all. We need to think fundamentally differently about how we deliver stroke care in rural areas to begin reducing these disparities."

In an effort to understand the circumstances that contribute to growing differences in life expectancy between urban and rural populations, the researchers focused on stroke. It is the fifth-leading cause of death in the U.S., with 140,000 deaths annually, according to the Centers for Disease Control and Prevention. Strokes typically occur suddenly, when a blood clot interrupts the brain's blood supply. Past studies of urban-rural differences in stroke care and mortality have been limited to single medical centers or individual states, or were conducted before modern advances in stroke care became available.

The researchers studied data from more than 790,000 patients nationwide who were hospitalized for stroke from 2012 through 2017. Overall, in-hospital mortality was about 6%. However, compared with patients hospitalized in urban areas, the risk of death was about 5% higher for patients hospitalized in large towns (with populations of 250,000 to 1 million people), 10% higher for patients in small towns (with populations of 50,000 to 250,000), 16% higher for patients in rural areas (with populations of 10,000 to 50,000) and 21% higher for patients in remote rural areas (with populations of less than 10,000). These gaps did not improve over the five-year span of the study.
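The reported gaps are relative increases over the urban risk. Treating the study's roughly 6% overall in-hospital mortality as the urban baseline for illustration, the absolute rate implied by each relative increase can be computed directly:

```python
# Sketch of the arithmetic behind the reported gaps. The 6% figure is the
# overall in-hospital mortality from the study; using it as the urban
# baseline is a simplifying assumption for illustration.
BASE_MORTALITY = 0.06

# Relative increases in risk of death versus urban hospitals, by setting.
RELATIVE_INCREASE = {
    "urban": 0.00,
    "large_town": 0.05,    # pop. 250,000 - 1 million
    "small_town": 0.10,    # pop. 50,000 - 250,000
    "rural": 0.16,         # pop. 10,000 - 50,000
    "remote_rural": 0.21,  # pop. under 10,000
}

def approx_mortality(setting):
    """Approximate absolute in-hospital mortality for a setting."""
    return BASE_MORTALITY * (1.0 + RELATIVE_INCREASE[setting])
```

Under these assumptions, a 21% relative increase corresponds to roughly 7.3% absolute mortality in remote rural hospitals versus about 6% in urban ones.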

In more rural areas, the investigators also found that patients were less likely to receive either of two of the more advanced stroke treatments available: intravenous thrombolysis, in which clot-busting drugs are injected directly into a vein to dissolve the clot; and endovascular therapy, in which a small catheter is inserted into the blocked blood vessel in the brain to remove the clot.

"It's not realistic to expect small, rural hospitals to perform some of the more advanced procedures, such as endovascular therapy," Joynt Maddox said. "But they can recognize when patients need more advanced care and transfer patients to a hospital that has those capabilities. One problem is that health systems don't have consistent, widespread procedures in place to make sure that stroke patients in rural areas have access to these technologies when they need them."

Complicating matters for rural patients, effective stroke care is highly time-dependent. The clot-busting drugs must be administered within the first three hours after the onset of symptoms, such as face drooping and slurred speech, and endovascular therapy must be given within six to eight hours after symptoms appear. After these time points, the risks of the procedures, especially internal bleeding, begin to outweigh the benefits.

Symptoms of a stroke can include face drooping, numbness or weakness on one side of the face or in one limb, slurred speech, confusion or difficulty understanding speech, vision problems, dizziness or difficulty walking, and sudden severe headache.

Although the researchers were not able to assess the amount of time between the onset of symptoms and the first treatments being administered, they suspect that rural patients may live or work farther from hospitals and have to wait longer for emergency services to respond to a 911 call than patients in urban areas.

There is also the difficulty of recognizing when a person is having a stroke.

"Patient education is another important piece of this problem because people often don't recognize the symptoms of a stroke when it's happening," said first author and cardiologist Gmerice Hammond, MD, a health policy research fellow in the Cardiovascular Division. "We can have great systems in place, but they will underperform if people wait too long to seek care because patients and family members don't recognize when someone is having a stroke."

Given the difficulty in accessing the more complex interventions for rural patients, Hammond emphasized the importance of working to prevent strokes in the first place, including better access to primary care in rural areas and other preventive measures that could help people address stroke risk factors, such as high blood pressure, smoking, diabetes and obesity.

"Simply living in a more rural setting can be a risk factor for stroke and worse outcomes after therapy," Hammond said. "Overall stroke care is improving with newer technologies, but our data show that those improvements are not reaching patients who live far from urban centers. The rural population is an important place to focus interventions and research, so we can develop ways to expand access to the latest stroke therapies and improve care for these patients."

Credit: 
Washington University School of Medicine

New research says displaying fake reviews increases consumer trust in platforms

New INFORMS Journal Information Systems Research Study Key Takeaways:

80% of users in our survey agree they trust a review platform more if it displays fake reviews.

85% of users in our survey believe they should be able to choose if they want to view truthful and fraudulent information side by side on a platform.

Our results suggest platforms should provide transparent policies to deal with fake reviews instead of censoring them without comment.

Platforms should pay attention to design and provide understandable "review quality scores" to help consumers understand the level of fraudulent activity associated with a seller.

CATONSVILLE, MD, June 18, 2020 - Many people are using COVID-19 quarantine to get projects done at home, meaning plenty of online shopping for tools and supplies. But do you buy blind? Research shows 97% of consumers consult product reviews before making a purchase. Fake reviews are a significant threat to online review portals and product search engines, given the potential damage to consumer trust. Yet little is known about what review portals should do with fraudulent reviews after detecting them.

New research in the INFORMS journal Information Systems Research looks at how consumers respond to potentially fraudulent reviews and how review portals can leverage this information to design better fraud management policies.

"We find consumers have more trust in the information provided by review portals that display fraudulent reviews alongside nonfraudulent reviews, as opposed to the common practice of censoring suspected fraudulent reviews," said Beibei Li of Carnegie Mellon University. "The impact of fraudulent reviews on consumers' decision-making process increases with the uncertainty in the initial evaluation of product quality."

The study, "A Tangled Web: Should Online Review Portals Display Fraudulent Reviews?" conducted by Li alongside Michael Smith, also of Carnegie Mellon University, and Uttara Ananthakrishnan of the University of Washington, finds that consumers do not effectively process the content of fraudulent reviews, whether positive or negative. This result makes the case for incorporating fraudulent reviews into what platforms display, ideally in the form of a score that aids consumers' decision making.
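One way to act on that recommendation is a simple aggregate score rather than per-review labels. The sketch below is a hypothetical "review quality score" of the kind the authors suggest; the `flagged_fraudulent` field name is an assumption for illustration, not part of any real platform's API.

```python
# Hypothetical review quality score: the share of a seller's reviews that
# the platform's fraud detector did NOT flag, expressed on a 0-100 scale.
def review_quality_score(reviews):
    """Return percentage of reviews not flagged as fraudulent, or None."""
    if not reviews:
        return None  # no information to score
    genuine = sum(1 for r in reviews if not r["flagged_fraudulent"])
    return round(100.0 * genuine / len(reviews), 1)
```

A seller with 8 genuine and 2 flagged reviews would score 80.0, giving consumers a single transparent signal of fraudulent activity without hiding the underlying reviews.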

Fraudulent reviews occur when businesses artificially inflate ratings of their own products or artificially lower the ratings of a competitor's product by generating fake reviews, either directly or through paid third parties.

"The growing interest in online product reviews for legitimate promotion has been accompanied by an increase in fraudulent reviews," continued Li. "Various media and industry reports estimate that about 15%-30% of all online reviews are fraudulent."

Platforms don't have a common way to handle fraudulent reviews. Some delete fraudulent reviews outright (Google), some publicly acknowledge censoring fake reviews (Amazon), while others, such as Yelp, go a step further by making fraudulent reviews visible to the public with a notation that they are potentially fraudulent.

The study used large-scale data from Yelp to conduct experiments measuring trust, and found that 80% of surveyed users agree they trust a review platform more if it displays fraudulent review information, because businesses are less likely to post fake reviews on such platforms.

Meanwhile, 85% of surveyed users believe they should have a choice in viewing truthful and fraudulent information, and that platforms should let consumers decide whether to use fraudulent review information in judging the quality of a business.

The study also finds that consumers tend to trust the information provided by a platform more when it distinguishes fraudulent reviews from nonfraudulent ones and displays both, compared with the more common practice of censoring suspected fraudulent reviews.

"Our results highlight the importance of transparency over censorship and may have implications for public policy. Just as there are strong incentives to fraudulently manipulate consumer beliefs pertaining to commerce, there are also strong incentives to fraudulently manipulate individual beliefs pertaining to public policy decisions," concluded Li.

When this fraudulent activity information is made available to all consumers, platforms can effectively embed a built-in penalty for businesses that are caught writing fake reviews. A platform may admit to users that there is fraud on its site, but that is balanced by an increase in trust from consumers who already suspected that some reviews may be fraudulent and now see that something is being done to address it.

Credit: 
Institute for Operations Research and the Management Sciences

Soap bubbles pollinated a pear orchard without damaging delicate flowers

image: This photograph shows a chemically functionalized soap bubble on a campanula flower (Campanula persicifolia).

Image: 
Eijiro Miyako

Soap bubbles facilitated the pollination of a pear orchard by delivering pollen grains to targeted flowers, demonstrating that this whimsical technique can successfully pollinate fruit-bearing plants. The study, from the Japan Advanced Institute of Science and Technology in Nomi, Japan, and published June 17 in the journal iScience, suggests that soap bubbles may present a low-tech complement to robotic pollination technology designed to supplement the work of vanishing bees.

"It sounds somewhat like fantasy, but the functional soap bubble allows effective pollination and assures that the quality of fruits is the same as with conventional hand pollination," says senior author Eijiro Miyako, an associate professor in the School of Materials Science at the Japan Advanced Institute of Science and Technology. "In comparison with other types of remote pollination, functional soap bubbles have innovative potentiality and unique properties, such as effective and convenient delivery of pollen grains to targeted flowers and high flexibility to avoid damaging them."

Miyako and colleagues previously published a study in the journal Chem, in which they used a tiny toy drone to pollinate blossoming flowers (doi.org/10.1016/j.chempr.2017.01.008). But although the drone was only two centimeters long, the researchers struggled to prevent it from destroying the flowers as it bumped into them. While searching for a more flower-friendly artificial pollination technique, Miyako spent a day at the park blowing bubbles with his son. When one of the bubbles collided with his son's face--a predictably injury-free accident--Miyako found his inspiration.

After confirming through optical microscopy that soap bubbles could, in fact, carry pollen grains, Miyako and Xi Yang, his coauthor on the study, tested the effects of five commercially available surfactants on pollen activity and bubble formation. The neutralized surfactant lauramidopropyl betaine (A-20AB) won out over its competitors, facilitating better pollen germination and growth of the tube that develops from each pollen grain after it is deposited on a flower. Based on a laboratory analysis of the most effective soap concentrations, the researchers tested the performance of pear pollen grains in a 0.4% A-20AB soap bubble solution with an optimized pH and added calcium and other ions to support germination. After three hours of pollination, the pollen activity mediated through the soap bubbles remained steady, while other methods such as pollination through powder or solution became less effective.

Miyako and Yang then loaded the solution into a bubble gun and released pollen-loaded bubbles into a pear orchard, finding that the technique distributed pollen grains (about 2,000 per bubble) to the flowers they targeted, producing fruit that demonstrated the pollination's success. Finally, the researchers loaded an autonomous, GPS-controlled drone with functionalized soap bubbles, which they used to direct soap bubbles at fake lilies (since flowers were no longer in bloom) from a height of two meters, hitting their targets at a 90% success rate when the machine moved at a velocity of two meters per second.

Although this approach to pollination appears promising, further refinements are needed to improve its precision. And with soap bubbles, weather is key: raindrops can wash pollen-bearing bubbles off flowers, while strong winds might blow them astray.

Next, Miyako and colleagues plan to tackle the issue of waste generated by the artificial pollinator prototype, since most bubbles still fail to land on their target flowers. "I believe that further innovative technologies, such as state-of-the-art localization and mapping, visual perception, path planning, motion control, and manipulation techniques would be essential for developing autonomous precision robotic pollination on a large scale," says Miyako.

Credit: 
Cell Press

Custom-built to ready-made

Information technology continues to progress at a rapid pace. However, the growing demands of data centers have pushed electrical input-output systems to their physical limit, creating a bottleneck. Maintaining this growth will require a shift in how we build computers. The future is optical.

Over the last decade, photonics has provided a solution to the chip-to-chip bandwidth problem in the electronic world, extending link distances between servers while offering higher bandwidth, far lower energy use and lower latency than electrical interconnects.

One element of this revolution, silicon photonics, was advanced 15 years ago when UC Santa Barbara and Intel demonstrated a silicon laser technology, triggering an explosion in the field. Intel now delivers millions of silicon photonic transceivers to data centers around the world.

A new discovery in silicon photonics by a collaboration of UC Santa Barbara, Caltech and the Swiss Federal Institute of Technology Lausanne (EPFL) marks another revolution in the field. The group managed to simplify and condense a complex optical system onto a single silicon photonic chip. The achievement, featured in Nature, significantly lowers production costs and allows easy integration with traditional silicon chip manufacturing.

"The entire internet is driven by photonics now," said John Bowers, who holds the Fred Kavli Chair in Nanotechnology at UC Santa Barbara, directs the campus's Institute for Energy Efficiency and led the collaborative research effort.

Despite the great success of photonics in the internet backbone, challenges remain. The explosion of data traffic places growing demands on the data rate each individual silicon photonic chip can handle. Using multicolor laser light to transmit information is the most efficient way to meet this demand: the more laser colors, the more information that can be carried.

However, this poses a problem for integrated lasers, which can generate only one color of laser light at a time. "You might literally need 50 or more lasers in that chip for that purpose," said Bowers. And using 50 lasers has a number of drawbacks. It's expensive and rather inefficient in terms of power. What's more, the frequency of light each laser produces can fluctuate slightly due to noise and heat. With multiple lasers, the frequencies can even drift into each other, much like early radio stations did.

A technology called the "optical frequency comb" provides a promising solution to this problem. The term refers to a collection of equally spaced frequencies of laser light. Plotting the frequencies reveals spikes and dips that resemble a hair comb -- hence the name. However, generating combs has required bulky, expensive equipment. Using an integrated photonics approach, Bowers' team has demonstrated the smallest comb generator in the world, which resolves all of these problems.
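The equal spacing of comb lines can be sketched in a few lines of code. This is an illustrative toy model, not anything from the study: each comb line sits at a fixed offset plus an integer multiple of the line spacing (the repetition rate), and the starting frequency and spacing chosen below are arbitrary example values.

```python
def comb_lines(f_start_hz, f_rep_hz, n_lines):
    """Return the first n_lines frequencies of an ideal comb, in Hz.

    f_start_hz -- frequency of the first comb line
    f_rep_hz   -- spacing between adjacent lines (repetition rate)
    """
    return [f_start_hz + n * f_rep_hz for n in range(n_lines)]

# Example: five lines spaced 100 GHz apart, starting near 193 THz
# (a typical telecom-band frequency).
for f in comb_lines(193.0e12, 100e9, 5):
    print(f"{f / 1e12:.2f} THz")
```

Each line of such a comb can carry its own data stream, which is why one comb source can replace dozens of individual lasers.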

The configuration of the system is rather simple, consisting of a commercial distributed-feedback laser and a silicon nitride photonic chip. "What we have is a source that generates all these colors out of one laser and one chip. That's what's significant about this," Bowers said.

The simple structure leads to a significant reduction in scale, power and cost. The whole setup now fits in a package smaller than a matchbox, with an overall price and power consumption lower than those of previous systems.

What's more, the new technology is also much more convenient to operate. Previously, generating a stable comb had been a tricky endeavor. Researchers had to modulate the frequency and adjust power just right to produce a coherent comb state, called a soliton, and the process was not guaranteed to generate such a state every time. "The new approach makes the process as easy as switching on a room light," said coauthor Kerry Vahala, a professor of applied physics and information science and technology at Caltech.

"What is remarkable about the result is the reproducibility with which frequency combs can be generated on demand," added Tobias J. Kippenberg, professor of physics at EPFL who provided the low loss silicon nitride photonics chips, a technology already commercialized via LIGENTEC. "This process used to require elaborate control in the past".

The magic behind all these improvements lies in an interesting physical phenomenon. When the pump laser and resonator are integrated, the interaction between them forms a highly coupled system that is self-injection locking and simultaneously generates "solitons," pulses that circulate indefinitely inside the resonator and give rise to optical frequency combs. "Such interaction is the key to directly generating the comb and operating it in the soliton state," explained coauthor Lin Chang, a postdoctoral researcher in Bowers' lab.

This new technology will have a big impact on photonics. In addition to addressing the demand for multicolor light sources in communications products, it opens up new opportunities in many applications. One example is optical clocks, which provide the most accurate time standard in the world and have many uses -- from navigation in daily life to measurements of physical constants.

"Optical clocks used to be large, heavy and expensive," Bowers noted, "and there are only a few in the world. With integrated photonics, we can make something that could fit in a wristwatch, and you could afford it. Low noise integrated optical microcombs will enable a new generation of optical clocks, communications and sensors. We should see more compact, more sensitive GPS receivers coming out of this approach."

All in all, the future looks bright for photonics. "It is the key step to transfer the frequency comb technology from the laboratory to the real world," Bowers said. "It will change photonics and our daily lives."

Credit: 
University of California - Santa Barbara

First-degree incest: ancient genomes uncover Irish passage tomb dynastic elite

image: Newgrange as seen on a misty morning.

Image: 
Ken Williams, shadowsandstone.com

> The genome of an adult male from the heart of the world famous Newgrange passage tomb points to first-degree incest, implying dynasty and echoing local place-name folklore first recorded in Medieval times

> Far-flung kinship ties between Newgrange and passage tomb cemeteries in the west (Carrowkeel and Carrowmore, Co. Sligo) indicate an elite social stratum was widespread

> Before megalith builders arrived en masse, Ireland was home to a small hunter-gatherer population, whose genomes speak of long-term isolation from Britain and Europe

> The earliest case of Down Syndrome was discovered in a male infant from the famous Poulnabrone portal tomb

Archaeologists and geneticists, led by those from Trinity College Dublin, have shed new light on the earliest periods of Ireland's human history.

Among their incredible findings is the discovery that the genome of an adult male buried in the heart of the Newgrange passage tomb points to first-degree incest, implying he was among a ruling social elite akin to the similarly inbred Inca god-kings and Egyptian pharaohs.

Older than the pyramids, Newgrange passage tomb in Ireland is world famous for its annual solar alignment where the winter solstice sunrise illuminates its sacred inner chamber in a golden blast of light. However, little is known about who was interred in the heart of this imposing 200,000 tonne monument or of the Neolithic society which built it over 5,000 years ago.

The survey of ancient Irish genomes, published today in leading international journal, Nature, suggests a man who had been buried in this chamber belonged to a dynastic elite. The research, led by the research team from Trinity, was carried out in collaboration with colleagues from University College London, National University of Ireland Galway, University College Cork, University of Cambridge, Queen's University Belfast, and Institute of Technology Sligo.

"I'd never seen anything like it," said Dr Lara Cassidy, Trinity, first author of the paper. "We all inherit two copies of the genome, one from our mother and one from our father; well, this individual's copies were extremely similar, a tell-tale sign of close inbreeding. In fact, our analyses allowed us to confirm that his parents were first-degree relatives."

Matings of this type (e.g. brother-sister unions) are a near universal taboo for entwined cultural and biological reasons. The only confirmed social acceptances of first-degree incest are found among the elites - typically within a deified royal family. By breaking the rules, the elite separates itself from the general population, intensifying hierarchy and legitimizing power. Public ritual and extravagant monumental architecture often co-occur with dynastic incest, to achieve the same ends.

"Here the auspicious location of the male skeletal remains is matched by the unprecedented nature of his ancient genome," said Professor of Population Genetics at Trinity, Dan Bradley. "The prestige of the burial makes this very likely a socially sanctioned union and speaks of a hierarchy so extreme that the only partners worthy of the elite were family members."

The team also unearthed a web of distant familial relations between this man and other individuals from sites of the passage tomb tradition across the country, including the mega-cemeteries of Carrowmore and Carrowkeel in Co. Sligo.

"It seems what we have here is a powerful extended kin-group, who had access to elite burial sites in many regions of the island for at least half a millennium," added Dr Cassidy.

Remarkably, a local myth resonates with these results and the Newgrange solar phenomenon. First recorded in the 11th century AD, four millennia after construction, the story tells of a builder-king who restarted the daily solar cycle by sleeping with his sister. The Middle Irish place name for the neighbouring Dowth passage tomb, Fertae Chuile, is based on this lore and can be translated as 'Hill of Sin'.

"Given the world-famous solstice alignments of Brú na Bóinne, the magical solar manipulations in this myth already had scholars questioning how long an oral tradition could survive," said Dr Ros Ó Maoldúin, an archaeologist on the study. "To now discover a potential prehistoric precedent for the incestuous aspect is extraordinary."

The genome survey stretched over two millennia and unearthed other unexpected results. Within the oldest known burial structure on the island, Poulnabrone portal tomb, the earliest yet diagnosed case of Down Syndrome was discovered in a male infant who was buried there five and a half thousand years ago. Isotope analyses of this infant showed a dietary signature of breastfeeding. In combination, this provides an indication that visible difference was not a barrier to prestige burial in the deep past.

Additionally, the analyses showed that the monument builders were early farmers who migrated to Ireland and replaced the hunter-gatherers who preceded them. However, this replacement was not absolute; a single western Irish individual was found to have an Irish hunter-gatherer in his recent family tree, pointing toward a swamping of the earlier population rather than an extermination.

Genomes from the rare remains of Irish hunter-gatherers themselves showed they were most closely related to the hunter-gatherer populations from Britain (e.g. Cheddar Man) and mainland Europe. However, unlike British samples, these earliest Irelanders had the genetic imprint of a prolonged island isolation. This fits with what we know about prehistoric sea levels after the Ice Age: Britain maintained a land bridge to the continent long after the retreat of the glaciers, while Ireland was separated by sea and its small early populations must have arrived in primitive boats.

Credit: 
Trinity College Dublin

CAR T cells beyond cancer: Targeting senescence-related diseases

Chimeric antigen receptor (CAR) T cells have transformed the treatment of refractory blood cancers. These genetically engineered immune cells seek out and destroy cancer cells with precision. Now, scientists at Memorial Sloan Kettering are deploying them against other diseases, including those caused by senescence, a chronic "alarm state" in tissues. The scope of such ailments is vast and includes debilitating conditions, such as fibrotic liver disease, atherosclerosis, and diabetes.

Key to the success of CAR T cell therapy has been finding a good target. The first US Food and Drug Administration-approved CAR T cells target a molecule on the surface of blood cancer cells called CD19. It is present on cancer cells but on few normal cells, so side effects are limited.

Taking their cue from this prior work, a team of investigators including Scott Lowe, Chair of the Cancer Biology and Genetics Program in the Sloan Kettering Institute, and Michel Sadelain, Director of the Center for Cell Engineering at MSK, along with their trainees Corina Amor, Judith Feucht, and Josef Leibold, sought to identify a target on senescent cells. These cells no longer divide, but they actively send "help me" signals to the immune system.

"Senescence is a double-edge sword," says Dr. Lowe, a co-responding author on a new paper published June 17 in Nature. "Cells in this state play an important role in wound healing and cancer deterrence. But if they linger for too long, they can cause chronic inflammation, which itself is a cause of many diseases. Finding a way to safely eliminate these cells would be a major therapeutic breakthrough in the treatment of these diseases."

By comparing molecules on the surface of senescent cells to other cell types, the MSK scientists were able to identify a molecule -- urokinase plasminogen activator receptor (uPAR) -- that is enriched on these cells and mostly absent on others.

Then, they designed CAR T cells that recognize uPAR and tested them in several different mouse models of senescence-related diseases, including cancer and liver fibrosis. Fibrosis is a damaging process in which healthy tissue is gradually replaced by scar tissue and is a major cause of liver disease.

The engineered cells worked beautifully. They successfully eliminated senescent cells from two different mouse models of liver fibrosis. The CAR T cells also improved survival in mouse models of lung cancer when given along with drugs previously shown to induce senescence in this cancer type.

The team's next step will be to determine whether the uPAR-directed CAR T cells can effectively combat other senescence-related diseases, including atherosclerosis, diabetes, and osteoarthritis. Eventually, they hope to develop the cells for clinical use in people.

"This study demonstrates that T cell engineering and CAR therapy can be effective beyond cancer immunotherapy," says Dr. Sadelain, whose lab pioneered the first effective CAR T cells against cancer.

"We think this approach has the potential to tackle a number of senescence-related diseases for which new treatments are badly needed," Dr. Lowe adds.

Credit: 
Memorial Sloan Kettering Cancer Center

First dinosaur eggs were soft like a turtle's

image: The clutch of fossilized Protoceratops eggs and embryos examined in this study was discovered in the Gobi Desert of Mongolia at Ukhaa Tolgod.

Image: 
M. Ellison/©AMNH

New research suggests that the first dinosaurs laid soft-shelled eggs--a finding that contradicts established thought. The study, led by the American Museum of Natural History and Yale University and published today in the journal Nature, applied a suite of sophisticated geochemical methods to analyze the eggs of two vastly different non-avian dinosaurs and found that they resembled those of turtles in their microstructure, composition, and mechanical properties. The research also suggests that hard-shelled eggs evolved at least three times independently in the dinosaur family tree.

"The assumption has always been that the ancestral dinosaur egg was hard-shelled," said lead author Mark Norell, chair and Macaulay Curator in the Museum's Division of Paleontology. "Over the last 20 years, we've found dinosaur eggs around the world. But for the most part, they only represent three groups--theropod dinosaurs, which includes modern birds, advanced hadrosaurs like the duck-bill dinosaurs, and advanced sauropods, the long-necked dinosaurs. At the same time, we've found thousands of skeletal remains of ceratopsian dinosaurs, but almost none of their eggs. So why weren't their eggs preserved? My guess--and what we ended up proving through this study--is that they were soft-shelled."

Amniotes--the group that includes birds, mammals, and reptiles--produce eggs with an inner membrane or "amnion" that helps to prevent the embryo from drying out. Some amniotes, such as many turtles, lizards, and snakes, lay soft-shelled eggs, whereas others, such as birds, lay eggs with hard, heavily calcified shells. The evolution of these calcified eggs, which offer increased protection against environmental stress, represents a milestone in the history of the amniotes, as it likely contributed to reproductive success and thus to the spread and diversification of this group. Soft-shelled eggs rarely preserve in the fossil record, which makes it difficult to study the transition from soft to hard shells. Because modern crocodilians and birds, which are living dinosaurs, lay hard-shelled eggs, this eggshell type has been inferred for all non-avian dinosaurs.

The researchers studied embryo-containing fossil eggs belonging to two species of dinosaur: Protoceratops, a sheep-sized plant-eating dinosaur that lived in what is now Mongolia between about 75 and 71 million years ago, and Mussaurus, a long-necked, plant-eating dinosaur that grew to 20 feet in length and lived between 227 and 208.5 million years ago in what is now Argentina. The exceptionally preserved Protoceratops specimen includes a clutch of at least 12 eggs and embryos, six of which preserve nearly complete skeletons. Associated with most of these embryos--which have their backbones and limbs flexed, consistent with the position the animals would assume while growing inside the egg--is a diffuse black-and-white egg-shaped halo that obscures some of the skeleton. In contrast, two potentially hatched Protoceratops newborns in the specimen are largely free of the mineral halos. When they took a closer look at these halos with a petrographic microscope and chemically characterized the egg samples with high-resolution in situ Raman microspectroscopy, the researchers found chemically altered residues of the proteinaceous eggshell membrane that makes up the innermost eggshell layer of all modern archosaur eggshells. The same was true for the Mussaurus specimen. And when they compared the molecular biomineralization signature of the dinosaur eggs with eggshell data from other animals, including lizards, crocodiles, birds, and turtles, they determined that the Protoceratops and Mussaurus eggs were indeed non-biomineralized--and, therefore, leathery and soft.

"It's an exceptional claim, so we need exceptional data," said study author and Yale graduate student Jasmina Wiemann. "We had to come up with a brand-new proxy to be sure that what we were seeing was how the eggs were in life, and not just a result of some strange fossilization effect. We now have a new method that can be applied to all other sorts of questions, as well as unambiguous evidence that complements the morphological and histological case for soft-shelled eggs in these animals."

With data on the chemical composition and mechanical properties of eggshells from 112 other extinct and living relatives, the researchers then constructed a "supertree" to track the evolution of the eggshell structure and properties through time, finding that hard-shelled, calcified eggs evolved independently at least three times in dinosaurs, and probably developed from an ancestrally soft-shelled type.

"From an evolutionary perspective, this makes much more sense than previous hypotheses, since we've known for a while that the ancestral egg of all amniotes was soft," said study author and Yale graduate student Matteo Fabbri. "From our study, we can also now say that the earliest archosaurs--the group that includes dinosaurs, crocodiles, and pterosaurs--had soft eggs. Up to this point, people just got stuck using the extant archosaurs--crocodiles and birds--to understand dinosaurs."

Because soft eggshells are more sensitive to water loss and offer little protection against mechanical stressors, such as a brooding parent, the researchers propose that they were probably buried in moist soil or sand and then incubated with heat from decomposing plant matter, similar to some reptile eggs today.

Credit: 
American Museum of Natural History

Knock-knock? Who's there? How coral let symbiotic algae in

video: Building on Carnegie biologists' long-standing tradition of model organism development, Department of Embryology researchers set out to use the pulsing, feathery, lavender-colored soft coral Xenia to reveal the cell types and pathways that orchestrate the symbiotic relationship between a coral and its preferred species of algae. This knowledge can be applied to increase our understanding of other coral species and allow for further research into how these fragile ecosystems are threatened by warming oceans.

Image: 
Carnegie Embryology

Baltimore, MD-- New work from a team of Carnegie cell, genomic and developmental biologists solves a longstanding marine science mystery that could aid coral conservation. The researchers identified the type of cell that enables a soft coral to recognize and take up the photosynthetic algae with which it maintains a symbiotic relationship, as well as the genes responsible for this transaction.

Their breakthrough research is published in Nature.

Corals are marine invertebrates that build large exoskeletons from which reefs are constructed. But this architecture is only possible because of a mutually beneficial relationship between the coral and various species of single-celled algae called dinoflagellates that live inside individual coral cells.

"For years, researchers have been trying to determine the mechanism by which the coral host is able to recognize the algal species with which is compatible--as well as to reject other, less-desirable species--and then to ingest and maintain them in a symbiotic arrangement," explained Carnegie Embryology Director Yixian Zheng who, with colleagues Chen-ming Fan, Minjie Hu, and Xiaobin Zheng, conducted this research.

These dinoflagellates are photosynthetic, which means that, like plants, they can convert the Sun's energy into chemical energy. An alga will share the sugars it synthesizes with its coral host, which in turn provides the alga with the inorganic carbon building blocks it needs, as well as phosphorus, nitrate, and sulfur.

"However, ocean warming due to climate change is causing many coral hosts to lose their algal tenants--along with the nutrients that they provide--a phenomenon called bleaching," explained lead author Hu. "Without algae there to increase its food supply, the coral can die. This makes it particularly critical to understand the symbiotic mechanism now, as coral communities are increasingly jeopardized."

Building on Carnegie biologists' longstanding tradition of using a model organism approach to study complicated biological processes, the research team set out to use the pulsing, feathery, lavender-colored, soft coral Xenia to reveal the cell types and pathways that orchestrate the symbiotic relationship between a coral and its algae. This knowledge can then be applied to increase our understanding of other coral species and allow for further research into how these fragile ecosystems are threatened by warming oceans.

Applying a wide range of genomic, bioinformatic, and developmental biology tools, the researchers identified the type of cell that is required for the symbiotic relationship to occur. They discovered that it expresses a distinct set of genes, which enable it to identify, "swallow," and maintain an alga in a specialized compartment, as well as to prevent the alga from being attacked by the coral's immune system as a foreign invader. Furthermore, the researchers showed that the uptake process occurs over five stages, with stage one preceding the symbiotic relationship, stage three representing mature, alga-hosting cells, and stage five following expulsion of the alga.

Looking ahead, the team wants to understand how environmental stress affects progression through the five stages, which stage is most crucial for recovery after a bleaching event, and which genes function at each stage.

Earlier this year, Zheng was selected as one of 15 scientists awarded a grant from the Gordon and Betty Moore Foundation to support research on symbiosis in aquatic systems. The foundation launched its Symbiosis in Aquatic Systems Initiative last year. Current and emerging leaders in aquatic symbiosis research--as well as scientists who will apply their deep expertise from other areas of science to aquatic symbiosis--were selected from a competitive pool.

"Dr. Zheng's work using the soft coral, Xenia, is an exemplar of how model systems research can advance our understanding of fundamental processes in nature. We look forward to continued discovery as part of her Symbiosis in Aquatic Systems Initiative investigator award," Said Dr. Sara J. Bender, program officer, Science Program, Gordon and Betty Moore Foundation.

Credit: 
Carnegie Institution for Science

Juicy genomics

image: In this new study, researchers sequenced and compared the genomes of 100 tomato varieties to reveal what influences traits such as flavor and size.

Image: 
Zachary Lippman

When Pulitzer Prize and Grammy award winner Kendrick Lamar rapped "I got millions, I got riches buildin' in my DNA," he almost certainly wasn't talking about the humble tomato. But a new study unveiling more than 230,000 DNA differences across 100 tomato varieties, which will allow breeders and scientists to engineer larger, juicier, more profitable plants, proves that tomatoes indeed have riches buildin' in their DNA, too.

The study will be published June 17 in Cell.

"The vast majority of the DNA differences we discovered are completely new," says Michael Schatz, Bloomberg Distinguished Associate Professor of Computer Science and Biology at Johns Hopkins University and the study's co-corresponding author.

As one of the largest fruit crops in the world, commercially grown tomatoes are a $190 billion global industry, one that relies on pinpointing which large-scale differences between genomes, or structural variants, are responsible for the variety of tomato shapes, colors and tastes we see at the store.

Previous technologies, however, didn't allow scientists to read large portions of a genome; only small bits could be read at a time, in piecemeal fashion.

"Like a big jigsaw puzzle with hundreds of millions of small pieces, maybe you manage to put together the corners, but not the big blue sky. The new technology used in this study allowed us to zoom in and get larger, clearer puzzle pieces," says Schatz.

Using new DNA sequencing technology and software to 'sharpen' their view, Schatz and more than 30 collaborators around the world in a self-proclaimed "tomato consortium" were able to sequence and compare the genomes of 100 different tomato types. In doing so, they found more than 230,000 structural variants.

From there, the team dove deeper with detailed genetic experiments to understand how some of those variants affect tomato traits. In one experiment, they found that duplication of a particular gene causes a plant's tomatoes to be about 20% larger. Next, they discovered a gene that contributes to a smoky flavor in some tomatoes. And in a third set of experiments, the researchers uncovered a complex interaction involving four structural variants that can mitigate a potential trade-off between a feature that simplifies tomato harvesting and another that reduces productivity.

The scale of their investigation has never been accomplished for any other crop, says Cold Spring Harbor Laboratory professor and Howard Hughes Medical Institute investigator Zachary Lippman, who co-led the project.

"I think it sets the foundation for what other crops and people in those working on those crops should be thinking about," says Lippman.

"All crops are based on mutations. Everything that we eat is based on mutations and up until now it's pretty been pretty slow process to identify and evaluate the importance of those mutations."

Adds Schatz: "We've taken processes that used to take hundreds, or in some cases, even thousands of years, and performed them very rapidly. From here, we can apply our understanding of genetics to very rapidly domesticate some of the species related to tomatoes and create new crops to feed the world with."

Credit: 
Johns Hopkins University

The DNA tricks that gave us 100 different kinds of tomatoes

image: Tomatoes come in many sizes, colors, and flavors. CSHL Professor Zach Lippman, JHU Professor Mike Schatz, and colleagues around the world described the genetic underpinnings of 100 different types of tomatoes, including those in this photograph.

Image: 
Lippman lab/CSHL, 2020

An expansive new analysis of genetic variation among tomatoes has uncovered 230,000 previously hidden large-scale differences in DNA between varieties. As tomato plants evolved, segments of DNA were deleted, duplicated, or rearranged. These genomic "structural variations" underpin the vast diversity among tomatoes, changing flavors, altering yield, and shaping other important traits.

The study, a collaborative effort led by Cold Spring Harbor Laboratory Professor and Howard Hughes Medical Institute investigator Zachary Lippman and Johns Hopkins University (JHU) Professor Michael Schatz, is the most comprehensive analysis to date of structural genome variation for a major crop. Breeders and scientists will be able to apply the information to breed or engineer new, more desirable plants with greater efficiency.

Large-scale differences between genomes, known collectively as structural variants, are likely responsible for a wide range of plant features that breeders care about, but these elements have been notoriously difficult to study, leaving much of the genetic origins of tomato diversity unexplained, says Xingang Wang, a postdoctoral researcher in Lippman's lab. New DNA sequencing technology, along with powerful new genome editing technology, has recently made structural variants easier to detect and has made it possible to study how they affect crop traits. Lippman's team, in collaboration with scientists at Johns Hopkins University, the University of Georgia, the Boyce Thompson Institute, and others, seized the opportunity to investigate. Lippman notes,

"There was a whole massive amount of natural genetic variation that we were blind to. And the only way to get at it was through this new technology. And there were already quite a few examples in the literature of how some of that hidden--what we call structural variation--was important. And it was probably being grossly underestimated in terms of its importance. So we really just needed to walk through that door. And the only way to do it was to do it at scale with a hundred different genomes."

Together, the group sequenced and compared the genomes of 100 different varieties of tomato, including robust varieties suitable for industrial agriculture, succulent heirlooms, and wild relatives of cultivated tomatoes. Within those genomes, the team identified more than 230,000 structural variants.

To gain a better understanding of structural variants' role in diversity, the team showed that thousands of genes were affected by these structural changes. Then they used CRISPR--the genome editing tool that can make targeted changes in DNA--to show that duplication of a particular gene causes a plant's tomatoes to increase in size by about 30 percent. According to Schatz:

"Previous studies had identified this one gene called KLUH as being relevant. It makes the plant smaller. So it's the HULK, y'know superhero, spelled backwards. Previous studies had sort of thought they had identified a single nucleotide variant. But we really came to appreciate, it was not a single nucleotide variant--it was one of these structural variants that had never been detected before. And what it turns out is that there was a so-called copy number variant, where if you have extra copies of this KLUH gene, it tends to make all of the cells in the plants bigger, makes the fruits bigger. So we think that this is going to have an enormous impact in agricultural science, where we sort of did this in tomato, but the experimental design could be executed in basically every crop species of interest."

Investigating another variant, they tracked down a gene that contributes to a smoky flavor in some tomatoes. Wang says:

"By identifying the causal gene and the causal mutations of this flavor, in the future, breeders can have a more precise target to either increase the smoky flavor or try to remove the smoky flavor."

And in a third set of experiments, the researchers used CRISPR to tease out a complex interaction involving four structural variants that can mitigate a potential trade-off between a feature that simplifies tomato harvesting and another that reduces productivity.

Understanding how structural variants influence tomatoes, a $190 billion global industry, gives breeders new power to improve the properties of tomatoes, and shows how structural variants that can enhance breeding are likely hidden in the complex genomes of many other important crops, like corn, rice, and soybeans.

Credit: 
Cold Spring Harbor Laboratory

Examining association between common antibiotic use, risk of cardiovascular death

What The Study Did: This observational study examined the risk of cardiovascular death and sudden cardiac death associated with use of the antibiotic azithromycin compared with amoxicillin.

Authors: Jonathan G. Zaroff, M.D., of Kaiser Permanente Northern California in Oakland, is the corresponding author.

To access the embargoed study: Visit our For The Media website at this link https://media.jamanetwork.com/

(doi:10.1001/jamanetworkopen.2020.8199)

Editor's Note: The article includes conflicts of interest and funding/support disclosures. Please see the article for additional information, including other authors, author contributions and affiliations, conflicts of interest and financial disclosures, and funding and support.

Credit: 
JAMA Network

First egg from Antarctica is big and might belong to an extinct sea lizard

image: An artist's interpretation of a baby mosasaur emerging from an egg just moments after it was laid. The scene is set in the shallow waters of Late Cretaceous Antarctica. In the background, mountains are covered in vegetation due to a warm climate. In the upper right, an alternative hypothesis for egg laying is depicted, with the mosasaur laying an egg on the beach.

Image: 
John Maisano/The University of Texas at Austin Jackson School of Geosciences

In 2011, Chilean scientists discovered a mysterious fossil in Antarctica that looked like a deflated football. For nearly a decade, the specimen sat unlabeled and unstudied in the collections of Chile's National Museum of Natural History, with scientists identifying it only by its sci-fi movie-inspired nickname - "The Thing."

An analysis led by researchers at The University of Texas at Austin has found that the fossil is a giant, soft-shell egg from about 66 million years ago. Measuring more than 11 by 7 inches, the egg is the largest soft-shell egg ever discovered and the second-largest egg of any known animal.

The specimen is the first fossil egg found in Antarctica and pushes the limits of how big scientists thought soft-shell eggs could grow. Aside from its astounding size, the fossil is significant because scientists think it was laid by an extinct, giant marine reptile, such as a mosasaur -- a discovery that challenges the prevailing thought that such creatures did not lay eggs.

"It is from an animal the size of a large dinosaur, but it is completely unlike a dinosaur egg," said lead author Lucas Legendre, a postdoctoral researcher at UT Austin's Jackson School of Geosciences. "It is most similar to the eggs of lizards and snakes, but it is from a truly giant relative of these animals."

A study describing the fossil egg was published in Nature on June 17.

Co-author David Rubilar-Rogers of Chile's National Museum of Natural History was one of the scientists who discovered the fossil in 2011. He showed it to every geologist who came to the museum, hoping somebody had an idea, but he didn't find anyone until Julia Clarke, a professor in the Jackson School's Department of Geological Sciences, visited in 2018.

"I showed it to her and, after a few minutes, Julia told me it could be a deflated egg!" Rubilar-Rogers said.

Using a suite of microscopes to study samples, Legendre found several layers of membrane that confirmed that the fossil was indeed an egg. The structure is very similar to transparent, quick-hatching eggs laid by some snakes and lizards today, he said. However, because the fossil egg is hatched and contains no skeleton, Legendre had to use other means to zero in on the type of reptile that laid it.

He compiled a data set to compare the body size of 259 living reptiles to the size of their eggs, and he found that the reptile that laid the egg would have been more than 20 feet long from the tip of its snout to the end of its body, not counting a tail. In both size and living reptile relations, an ancient marine reptile fits the bill.
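The study itself used a dataset of 259 living reptiles; as a rough illustration of the kind of scaling analysis described, one can fit a log-log (allometric) relation between egg length and adult body length, then invert it for the fossil egg. The numbers below are made up for illustration only, not the study's data.

```python
import math

# Hypothetical measurements (meters) -- illustrative values only,
# not the 259-species dataset used in the actual study.
egg_len  = [0.01, 0.02, 0.035, 0.06, 0.10]   # egg length
body_len = [0.2, 0.5, 1.0, 2.0, 4.0]         # adult body length

# Ordinary least-squares fit in log-log space: log(body) = a + b*log(egg)
xs = [math.log(v) for v in egg_len]
ys = [math.log(v) for v in body_len]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

# Invert the fitted relation for a ~29 cm (11+ inch) egg
pred_body = math.exp(intercept + slope * math.log(0.29))
print(f"predicted adult body length: {pred_body:.1f} m")
```

With real data, extrapolating this far beyond the observed egg sizes would of course demand confidence intervals, which is one reason the authors also drew on the fossil context of the site.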

Adding to that evidence, the rock formation where the egg was discovered also hosts skeletons from baby mosasaurs and plesiosaurs, along with adult specimens.

"Many authors have hypothesized that this was sort of a nursery site with shallow protected water, a cove environment where the young ones would have had a quiet setting to grow up," Legendre said.

The paper does not discuss how the ancient reptile might have laid the eggs. But the researchers have two competing ideas.

One involves the egg hatching in the open water, which is how some species of sea snakes give birth. The other involves the reptile depositing the eggs on a beach and hatchlings scuttling into the ocean like baby sea turtles. The researchers say this approach would depend on some fancy maneuvering by the mother, because giant marine reptiles were too heavy to support their body weight on land. Laying the eggs would require the reptile to wriggle its tail onto the shore while staying mostly submerged and supported by the water.

"We can't exclude the idea that they shoved their tail end up on shore because nothing like this has ever been discovered," Clarke said.

Credit: 
University of Texas at Austin

Photonics: From custom-built to ready-made

image: Compact silicon-nitride integrated soliton microcomb chip device in a butterfly package with a fiber output.

Image: 
Lin Chang (UCSB)

Information technology continues to progress at a rapid pace. However, the growing demands of data centers have pushed electrical input-output systems to their physical limit, creating a bottleneck. Maintaining this growth will require a shift in how we build computers. The future is optical.

Over the last decade, the field of photonics has provided a solution to the chip-to-chip bandwidth problem in the electronic world, extending link distances between servers while offering higher bandwidth, far lower energy consumption, and lower latency than electrical interconnects.

One element of this revolution, silicon photonics, was advanced fifteen years ago when UC Santa Barbara and Intel demonstrated silicon laser technology, triggering an explosion in the field. Intel now delivers millions of silicon photonic transceivers to data centers around the world.

Now, a collaboration between UC Santa Barbara, Caltech, and EPFL has made another revolutionary advance in the field. The group managed to simplify and condense a complex optical system onto a single silicon photonic chip. The achievement, published in Nature, significantly lowers the cost of production and allows for easy integration with traditional silicon chip manufacturing.

"The entire internet is driven by photonics now," says John Bowers, who holds the Fred Kavli Chair in Nanotechnology at UC Santa Barbara and directs the campus's Institute for Energy Efficiency and led the collaborative research effort.

Despite the great success of photonics in the internet's backbone, challenges remain. The explosion of data traffic also means growing requirements for the data rates that silicon photonic chips can handle. So far, the most efficient way to address this demand is to use multicolor laser light to transmit information: the more laser colors, the more information can be carried.

But this poses a problem for integrated lasers, which can generate only one color of laser light at a time. "You might literally need fifty or more lasers in that chip for that purpose," says Bowers. And using fifty lasers is expensive and inefficient in terms of power. Also, noise and heat can cause the frequency of light that each laser produces to fluctuate. Finally, with multiple lasers, the frequencies can even drift into each other, much like early radio stations did.

A solution can be found in the technology of "optical frequency combs", which are collections of equally spaced frequencies of laser light. Plotting the frequencies reveals spikes and dips that resemble a hair comb -- hence the name.
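The defining property of a comb is that its lines are equally spaced: the n-th line sits at f_n = f_0 + n × f_rep, where f_0 is an offset and f_rep is the line spacing. A minimal sketch, with illustrative (not study-specific) numbers:

```python
# Sketch of a frequency comb: equally spaced optical frequencies
# f_n = f0 + n * f_rep. The values below are illustrative round
# numbers, not parameters from the published device.
f0 = 193.0e12     # offset frequency of the first line, Hz (~1550 nm telecom band)
f_rep = 100.0e9   # line spacing, Hz

comb = [f0 + n * f_rep for n in range(5)]  # first five comb lines ("colors")
for f in comb:
    print(f"{f / 1e12:.1f} THz")
```

Each line can serve as an independent carrier, which is why one comb source can replace the many separate lasers described above.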

Generating combs used to require bulky and expensive equipment, but this can now be managed using recently emerged microresonator-based soliton frequency combs, which are miniaturized frequency comb sources built on CMOS photonic chips. Using this "integrated photonics" approach, the collaborating team has developed the smallest comb generator in the world, which essentially resolves all of these issues.

The system is rather simple, consisting of a commercially available feedback laser and a silicon nitride photonic chip. "What we have is a source that generates all these colors out of one laser and one chip," says Bowers. "That's what's significant about this."

The simple structure means smaller scale, less power, and lower cost. The entire setup now fits in a package smaller than a matchbox, with a price and power consumption lower than those of previous systems.

The new technology is also much more convenient to operate. Previously, generating a stable comb had been a tricky endeavor. Researchers would have to adjust frequency and power just right to produce a coherent soliton comb, and even then, the process was not guaranteed to generate a comb every time. "The new approach makes the process as easy as switching on a room light," says Kerry Vahala, Professor of Applied Physics and Information Science and Technology at Caltech, where the new soliton generation scheme was discovered.

"What is remarkable about the result is the full photonic integration and reproducibility with which frequency combs can be generated on demand," adds Tobias J. Kippenberg, Professor of Physics at EPFL, who leads the Laboratory of Photonics and Quantum Measurements (LPQM) and whose laboratory first observed microcombs more than a decade ago.

The EPFL team provided the ultralow-loss silicon nitride photonic chips, which were fabricated at the EPFL Center of MicroNanoTechnology (CMi) and serve as the key component for soliton comb generation. The low-loss silicon nitride photonics technology has been commercialized via the lab's startup, LIGENTEC.

The "magic" behind all these improvements lies in an interesting physical phenomenon: when the pump laser and resonator are integrated, their interaction forms a highly coupled system that is self-injection-locking and simultaneously generates "solitons" - pulses that circulate indefinitely inside the resonator and give rise to optical frequency combs.

The new technology is expected to have an extensive impact on photonics. In addition to addressing the demands of multicolor light sources in communication-related products, it also opens up a lot of new opportunities in many applications. One example is optical clocks, which provide the most accurate time standard in the world and are used in a number of applications, from navigation to measuring physical constants.

"Optical clocks used to be large, heavy, and expensive," says Bowers. "There are only a few in the world. With integrated photonics, we can make something that could fit in a wristwatch, and you could afford it."

"Low-noise integrated optical microcombs will enable a new generation of optical clocks, communications and sensors," says Gordon Keeler, the project's manager at the Defense Advanced Research Projects Agency (DARPA). "We should see more compact, more sensitive GPS receivers coming out of this approach."

All in all, the future looks bright for photonics. "It is the key step to transfer the frequency comb technology from the laboratory to the real world," says Bowers. "It will change photonics and our daily lives."

Credit: 
Ecole Polytechnique Fédérale de Lausanne