Tech

NASA catches Tropical Cyclone Gelena's post-tropical transition

image: On Feb. 15, 2019 at 10 p.m. EST (Feb. 16, 2019 at 0300 UTC), the MODIS instrument aboard NASA's Aqua satellite provided a visible image of Gelena that showed the storm had transitioned into a post-tropical cyclone.

Image: 
NASA/NRL

Tropical cyclones can become post-tropical before they dissipate, meaning they can become subtropical or extratropical storms, or remnant low pressure areas. As Tropical Cyclone Gelena transitioned into a subtropical storm, NASA's Aqua satellite provided a visible image of the storm.

On Feb. 15 at 10 p.m. EST (Feb. 16 at 0300 UTC), the Joint Typhoon Warning Center (JTWC) noted that Gelena had already become subtropical and issued its final warning on the storm. At that time, Gelena had maximum sustained winds near 40 knots (46 mph/74 kph). It was centered near 29.8 degrees south latitude and 89.3 degrees east longitude, about 1,426 nautical miles east-southeast of Learmonth, Australia, and was moving east-southeast.

What is a Post-tropical Storm? 

Post-tropical storm is a generic term for a former tropical cyclone that no longer possesses sufficient tropical characteristics to be considered a tropical cyclone. Former tropical cyclones that have become fully extratropical, subtropical, or remnant lows make up the three classes of post-tropical cyclones. Even so, post-tropical cyclones can continue to bring heavy rain and high winds.

What is a Sub-tropical Storm?

According to the National Oceanic and Atmospheric Administration, a sub-tropical storm is a low-pressure system that is not associated with a frontal system and has characteristics of both tropical and extratropical cyclones. Like tropical cyclones, they are non-frontal systems that originate over tropical or subtropical waters and have a closed surface wind circulation about a well-defined center.

Unlike tropical cyclones, subtropical cyclones derive a significant proportion of their energy from baroclinic sources (horizontal temperature contrasts in the atmosphere), and are generally cold-core in the upper troposphere, often being associated with an upper-level low or an elongated trough of low pressure.

In comparison to tropical cyclones, these systems generally have a radius of maximum winds occurring relatively far from the center (usually greater than 60 nautical miles), and are generally less symmetric.

What is an Extra-tropical Storm?

Often, a tropical cyclone will transform into an extra-tropical cyclone as it recurves toward the poles (north or south, depending on the hemisphere the storm is located in). An extra-tropical cyclone is a storm system that primarily gets its energy from the horizontal temperature contrasts that exist in the atmosphere.

Tropical cyclones have their strongest winds near the Earth's surface, while extra-tropical cyclones have their strongest winds near the tropopause, about 8 miles (12 km) up. Tropical cyclones, in contrast, typically show little or no temperature difference across the storm at the surface, and their winds are derived from the release of energy as clouds and rain form from the warm, moist air of the tropics.

Visible NASA Imagery Shows the Transition

Visible imagery from NASA's Aqua satellite revealed Gelena's subtropical transition.

At 3:25 a.m. EST (0825 UTC) on Feb. 15, 2019, the Moderate Resolution Imaging Spectroradiometer or MODIS instrument aboard NASA's Aqua satellite provided a visible image of subtropical storm Gelena in the Southern Indian Ocean. The MODIS image showed that Gelena had a closed surface wind circulation about a well-defined center, but that the storm had become asymmetric, as subtropical storms do.

The Joint Typhoon Warning Center noted "animated multispectral satellite imagery shows cloud tops continuing to warm and convective structure continuing to unravel. [Another satellite] image showed remnant convective bands [of thunderstorms] sheared (pushed by winds) to the south and wrapping into an elongated partially exposed low level circulation center."

Gelena is expected to continue moving through the Southern Indian Ocean over the next day until it dissipates.

Credit: 
NASA/Goddard Space Flight Center

Researchers find genetic vulnerability to menthol cigarette use

A genetic variant found only in people of African descent significantly increases a smoker's preference for cigarettes containing menthol, a flavor additive. The variant of the MRGPRX4 gene is five to eight times more frequent among smokers who use menthol cigarettes than other smokers, according to an international group of researchers supported by the U.S. Food and Drug Administration and the National Institutes of Health. The multiethnic study is the first to look across all genes to identify genetic vulnerability to menthol cigarettes. The paper was published online in the journal PLOS Genetics on February 15, 2019.

Menthol provides a minty taste and a cooling or soothing sensation, and plays a particularly troubling role in U.S. cigarette smoking patterns. According to the FDA, nearly 20 million people in the United States smoke menthol cigarettes, which are particularly popular among African-American smokers and teen smokers. In the U.S., 86 percent of African-American smokers use menthol cigarettes, compared to less than 30 percent of smokers of European descent. In addition, menthol cigarettes may be harder to quit than other cigarettes.

Although not originally the focus of the study, researchers also uncovered clues as to how menthol may reduce the irritation and harshness of smoking cigarettes.

"This study sheds light on the molecular mechanisms of how menthol interacts with the body," said Andrew Griffith, M.D., Ph.D., scientific director and acting deputy director of NIH's National Institute on Deafness and Other Communications Disorders (NIDCD). "These results can help inform public health strategies to lower the rates of harmful cigarette smoking among groups particularly vulnerable to using menthol cigarettes."

The research team, led by Dennis Drayna, Ph.D., chief of the Section on Genetics of Communication Disorders at the NIDCD, conducted detailed genetic analyses on 1,300 adults. In the initial analyses, researchers at the University of Texas Southwestern Medical Center, Dallas (UT Southwestern), used data from a multiethnic, population-based group of smokers from the Dallas Heart Study and from an African-American group of smokers from the Dallas Biobank. In conjunction with researchers from the Schroeder Institute® for Tobacco Research, Washington, D.C., the scientists further confirmed their findings in a group of African-American smokers enrolled in the Washington, D.C., Tobacco Quitline™.

The researchers report that 5 to 8 percent of the African-American study participants had the gene variant. None of the participants of European, Asian, or Native American descent had the variant.

Identifying the genetic variant pointed the researchers in an unexpected direction, leading them to provide the first characterization of this naturally-occurring MRGPRX4 variant in humans. The gene codes for a sensor, or receptor, that is believed to be involved in detecting and responding to irritants from the environment in the lungs and airways.

"We expected to find genes that relate to taste receptors, since menthol is a flavor additive," said Drayna. "Instead, we discovered a different kind of signaling molecule that appears to be involved in menthol preference."

Collaborators at the University of North Carolina (UNC), Chapel Hill, then worked with the research team to look more closely at the effect of the African-specific variant on the function of the MRGPRX4 receptor. They found that the variant alters a specific type of cell signaling, and that menthol alters this further. Additional studies confirmed that this sensor is found in the airways, suggesting that menthol is likely to affect how we sense irritation in the airways.

"While this gene variant can't explain all of the increased use of menthol cigarettes by African-Americans, our findings indicate that this variant is a potentially important factor that underlies the preference for menthol cigarettes in this population. While things like cultural factors or industry advertising practices have been a focus for understanding menthol use thus far, our findings indicate that African-specific genetic factors also need to be considered," said Drayna.

The FDA has sought public commentary and scientific information on the use of menthol in tobacco products. The agency has announced plans to propose a ban on menthol-flavored cigarettes and cigars, in large part because of the high use of menthol cigarettes among youth and young adults. More than half of smokers ages 12 to 17 smoke menthol-flavored cigarettes. The prevalence rises to 7 out of 10 among African-American youth who smoke, according to the FDA.

Credit: 
NIH/National Institute on Deafness and Other Communication Disorders

The prospects of American strawberries

image: Commercial field production of strawberries.

Image: 
Jayesh Samtani

VIRGINIA BEACH, VA--A comprehensive review led by Jayesh Samtani of Virginia Tech and Curt Rom of the University of Arkansas captures the challenges, needs, and opportunities of strawberry growers across the United States. Samtani and Rom assembled a team of 12 researchers from 10 states to produce guidelines for research, policy, and marketing strategies for the strawberry industry across the country, and to enable the development of general and region-specific educational and production tools.

Their findings are summarized in "The Status and Future of the Strawberry Industry in the United States," an open-access article published in HortTechnology.

The review divides US strawberry production into eight distinct geographic regions, plus an indoor controlled- or protected-environment production system. A common trend across all regions is the increasing use of protected-culture strawberry production with both day-neutral and short-day cultivars to extend the season and meet consumer demand for year-round availability.

All regions experience challenges with pests and obtaining adequate harvest labor. Increasing consumer demand for berries, climate change-induced weather variability, high pesticide use, labor and immigration policies, and land availability impact regional production.

The United States produces more than 3 billion pounds of strawberries each year, providing almost 20% of the world crop, and is a global leader in production per unit area. The farm-gate value of the crop is just shy of $3 billion per year. Backed by that economic strength, US production acreage has grown steadily since 1990, increasing by approximately 17%, with the largest expansion in Florida and California.

US consumption of strawberries has increased significantly since 1980, from 2 pounds per capita to approximately 8 pounds per capita in recent years. Consumption is expected to continue to increase as a result of greater awareness of the health benefits of berries, year-round availability made possible through domestic production and protected culture, increased imports, and improved cultivars.

The future of strawberry production will be dictated both by grower production needs and consumer demands for the fruit. The number of growers who have reduced the use of fumigants has increased. In those regions that rely on fumigants to control soil-borne pests and weeds, there is increasing interest in alternative treatments such as the use of steam, enhanced soil solarization, or hot water treatments.

Until the economic viability of these alternative treatments is determined, growers facing pest pressures at their production sites are continuing to use fumigation, despite regulations against it. Automation and robotics to assist with the more labor-intensive tasks of planting, maintaining, and harvesting will be further developed and used to expand both the regions and seasons of production, thus increasing consumer accessibility and reducing the use of pesticides and the corresponding environmental impact.

Samtani adds, "What started off as a discussion and a general idea between myself and Curt Rom certainly progressed into a benchmark review. From its foundation as a USDA-SCRI planning grant proposal (Planning to Increase the Productivity and Competitiveness of Sustainable Strawberry Systems), the initiative gained momentum over time through our exchanges of ideas and thoughts. This culminated into a workshop at the 2017 ASHS Annual Conference. Speakers were carefully identified, ensuring those with sufficient knowledge, experience, and expertise were chosen to represent the different strawberry production regions of the US. We believe that we have provided a great overview of the different strawberry-producing regions of the United States--a topic that has not been investigated and documented at a national level."

Credit: 
American Society for Horticultural Science

Novel app uses AI to guide, support cancer patients

image: The MyPath app adapts to each stage in a patient's cancer journey.

Image: 
Christopher Moore, Georgia Tech

Artificial intelligence is helping to guide and support some 50 breast cancer patients in rural Georgia through a novel mobile application that gives them personalized recommendations on everything from side effects to insurance.

The app, called MyPath, adapts to each stage in a patient's cancer journey. So the information available on the app--which runs on a tablet computer--regularly changes based on each patient's progress. Are you scheduled for surgery? MyPath will tell you what you need to know the day before.

"Patients have told us, 'It just seemed to magically know what I needed,'" said Elizabeth Mynatt, principal investigator for the work and Distinguished Professor in the School of Interactive Computing at Georgia Tech.

Mynatt, who is also Executive Director of the Institute for People and Technology, believes that MyPath is the first healthcare app capable of personalization (through its application of AI) for holistic cancer care. In addition to incorporating a patient's medical data, the app also addresses a variety of other relevant issues such as social and emotional needs.

She will present the work February 15 at the 2019 annual meeting of the American Association for the Advancement of Science.

National Recognition

In January MyPath was recognized by iSchools, a consortium of some 100 institutions worldwide (including Georgia Tech) dedicated to advancing the information field. Maia Jacobs, who recently received her Ph.D. from Georgia Tech for her work on MyPath, was named winner of the 2019 iSchools Doctoral Dissertation Award.

According to iSchools, "the Award Committee felt [that Jacobs' work] was timely and important, and lauded its impact in how patients manage their health." Jacobs, now a postdoctoral fellow at Harvard, is currently exploring how to expand MyPath to other diseases.

The work was also honored in 2016 when it was featured in a report to President Barack Obama by the President's Cancer Panel. The report, Improving Cancer-Related Outcomes with Connected Health, aimed to "help patients manage their health information and participate in their own care," according to a Georgia Tech story at the time.

The Beginning

Six years ago Mynatt's team began working with the Harbin Clinic in Rome, Georgia. "They have a tremendous program in holistic cancer care where they recognize that their patients, who are from a large rural area, face a variety of challenges to be able to successfully navigate the cancer journey," Mynatt said.

But the Harbin doctors and cancer navigators--people who help patients through the cancer journey--wanted a better way to stay connected to patients on a regular basis. The navigators, in particular, found that they tended to interact with patients a great deal at diagnosis, but less frequently over time. And that meant that although there are many recommendations for, say, lowering anxiety, they weren't necessarily being communicated.

Said Mynatt, "We wondered how technology could amplify what these great people are doing."

How it Works

MyPath begins with a mobile library of resources compiled from the American Cancer Society and other reputable organizations. Then, it is personalized with each patient's diagnosis and treatment plan, including the dates for specific procedures. Patients also complete regular surveys that help inform the system--and caregivers--of their changing needs and symptoms.

The result is a system that provides each patient with resources and suggestions specific to their personal situation. Because MyPath knows, for example, that you have stage 2 breast cancer and will be undergoing a lumpectomy on a specific date, when you click on the category "Preparing for Surgery" it will suggest relevant articles to prepare you for what's ahead. Have you reported nausea in the system's survey? MyPath will bring your attention to resources that can help combat the side effect. The system also provides quick access to contact information for specific caregivers.
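As a rough illustration of this kind of stage- and symptom-driven personalization, the sketch below filters a resource library against a patient's current treatment stage and survey responses. It is a hypothetical toy; the resource entries, field names and rules are invented for illustration and are not taken from the MyPath codebase.

```python
# Hypothetical sketch of stage- and symptom-based filtering; the entries and
# field names are illustrative, not MyPath's actual data model or rules.
RESOURCES = [
    {"title": "Preparing for a lumpectomy",          "stage": "pre-surgery",  "tags": set()},
    {"title": "Managing nausea during chemotherapy", "stage": "chemotherapy", "tags": {"nausea"}},
    {"title": "Questions to ask about insurance",    "stage": "any",          "tags": set()},
]

def recommend(stage, reported_symptoms):
    """Return titles matching the patient's current stage or survey-reported symptoms."""
    symptoms = set(reported_symptoms)
    return [r["title"] for r in RESOURCES
            if r["stage"] in (stage, "any") or (r["tags"] & symptoms)]

print(recommend("chemotherapy", ["nausea"]))
# ['Managing nausea during chemotherapy', 'Questions to ask about insurance']
```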

Other apps--and the Internet--aren't personalized. That means slogging through a great deal of often technical information that's not relevant to your situation. In contrast, "Every day MyPath puts the right resources at your fingertips to help you through your cancer journey," Mynatt said.

More than Medical

Some of MyPath's most popular features have nothing to do directly with cancer. Buttons for "Emotional Support" and "Day to Day Matters" are regularly consulted by patients. "When we asked them about how they used the tablet for healthcare, many patients would talk to us about playing Angry Birds, which they would download to distract them during chemo sessions," Mynatt said.

MyPath is the second generation of the app. Patient feedback from its predecessor, My Journey Compass, led to changes including the personalization. Development continues. For example, Mynatt's team is hoping to expand the app for use by cancer survivors, who often face additional challenges like hormone replacement therapy. The team is also working on a version that individual patients could download, which would make the app available to many more users.

Credit: 
Georgia Institute of Technology

'Cellular barcoding' reveals how breast cancer spreads

image: This photo shows from left to right Dr. Shalin Naik, Professor Jane Visvader, Dr. Tom Weber and Dr. Delphine Merino.

Image: 
Walter and Eliza Hall Institute of Medical Research

A cutting-edge technique called cellular barcoding has been used to tag, track and pinpoint cells responsible for the spread of breast cancer from the main tumour into the blood and other organs.

The technique also revealed how chemotherapy temporarily shrinks the number of harmful cells, rather than eliminating them, explaining how the cancer could eventually relapse.

Insights from the study, published today in Nature Communications, could lead to new targeted treatments for breast cancer, the most common cancer to affect women.

Dr Delphine Merino, Dr Tom Weber, Professor Jane Visvader, Professor Geoffrey Lindeman and Dr Shalin Naik led the highly collaborative research that involved breast cancer biologists, clinician scientists, biotechnologists and computational experts at the Walter and Eliza Hall Institute of Medical Research.

Pinpointing the ‘seeders’ of disease

Most deaths from breast cancer are caused by the metastasis, or spread, of cancerous cells from the main tumour site into other organs.

Breast cancers consist of thousands of different cell variants with diverse characteristics that may or may not play a role in the metastasis of the cancer. This makes effective treatment a challenge because it is difficult to know which cells are responsible for driving the spread of cancer.

Dr Merino said the ability to pinpoint the 'clones' - subpopulations of cells arising from an original patient tumour - responsible for the spread of cancer was crucial for improving treatments.

"Our study revealed that only a select few clones were actually responsible for the metastasis.

"The barcoding technique enabled us to identify the clones that were able to get into the blood stream and make their way into other organs where they would 'seed' new tumour growth," Dr Merino said.

Professor Visvader said the technique also allowed the researchers to see what was happening to the clones after chemotherapy was introduced.

"We used the chemotherapy agent Cisplatin to treat laboratory models developed using donated breast tumour tissue. While the treatment was able to shrink tumours and the size of individual clones, it did not kill them off completely. All the clones, including the nasty seeders, eventually grew again, accounting for cancer relapse.

"These exciting findings would not have been possible without the ability to meticulously barcode and track thousands of individual clones and watch their behaviour over time," she said.

New technique 'tags and tracks'

The cellular barcoding technique used for the study was developed in 2013 by Dr Naik and Professor Ton Schumacher from the Netherlands Cancer Institute.

Dr Naik said this new technique meant researchers could go from studying thousands of clones, to homing in on the select few variants responsible for the spread of cancer.

"Now that we know which clones are involved in the spread of breast cancer, we have the power to really focus our research to block their activity. For instance, we are curious to understand what is unique about these particular clones that enables them to successfully spread, seed and grow the cancer," Dr Naik said.

Enabling a targeted approach to treatment

Professor Visvader said the precision of the approach could pave the way for unravelling important mysteries in the field of breast cancer research and equip scientists with the information needed to design highly targeted treatment strategies for the prevalent disease.

"An important goal is to understand the molecular and cellular basis of how breast cancer spreads and, working with clinician scientists like Professor Lindeman, translate this knowledge from the laboratory into the clinic," she said.

Credit: 
Walter and Eliza Hall Institute

'Seeing' tails help sea snakes avoid predators

image: This is an olive sea snake (Aipysurus laevis) diving underwater. Sea snakes live their entire lives at sea and must come up to the sea surface to breathe air.

Image: 
Chris Malam

New research has revealed the fascinating adaptation of some Australian sea snakes that helps protect their vulnerable paddle-shaped tails from predators.

An international study led by the University of Adelaide shows that several species of Australian sea snakes can sense light on their tail skin, prompting them to withdraw their tails under shelter. The study has also produced new insights into the evolution and genetics of this rare light sense.

The researchers found that olive sea snakes (Aipysurus laevis) and other Aipysurus species move their tail away from light. They believe this is an adaptation to keep the tail hidden from sharks and other predators.

"Sea snakes live their entire lives at sea, swimming with paddle-shaped tails and resting at times during the day under coral or rocky overhangs," says study lead author Jenna Crowe-Riddell, PhD candidate in the University of Adelaide's School of Biological Sciences. "Because sea snakes have long bodies, the tail-paddle is a large distance from the head, so benefits from having a light-sense ability of its own.

"The olive sea snake was the only reptile, out of more than 10,000 reptile species, that was known to respond to light on the skin in this way."

The researchers tested for light-sensitive tails in eight species of sea snakes, but found that only three species had the light-sense ability. They concluded the unique ability probably evolved in the ancestor of just six closely related Australian species.

"There are more than 60 species of sea snake so that's less than 10% of all sea snakes," says Ms Crowe-Riddell. "We don't know why this rare sense has evolved in just a few Aipysurus species."

The researchers used RNA sequencing to see what genes are active in the skin of sea snakes. They discovered a gene for a light-sensitive protein called melanopsin, and several genes that are involved in converting light into information in the nervous system.

"Melanopsin is used in a range of genetic pathways that are linked to sensing overall light levels around us. It is even used by some animals, including humans, for regulating sleep cycles and in frogs to change their skin colour as a camouflage," says Ms Crowe-Riddell.

Lead scientist Dr Kate Sanders, ARC Future Fellow at the University of Adelaide, says: "We've confirmed the ability of olive sea snakes to sense light in their tails and found the same ability in two other species. We've identified a shortlist of genes that are likely to be involved in detecting light. But further study will be needed to target these genes before we can really understand the genetic pathways involved in this fascinating behaviour."

Credit: 
University of Adelaide

New molecular blueprint advances our understanding of photosynthesis

image: A cartoon schematic of the electron transport chain of photosynthesis, in which energy from sunlight creates high-energy electrons that are shuttled among various protein complexes. The electron shuttling process is coupled with proton pumps that power ATP formation by ATP synthase. An electron can flow linearly to power NADPH formation, or it can be cycled between photosystem I and NDH to boost ATP synthesis.

Image: 
Thomas Laughlin/UC Berkeley and Berkeley Lab

Researchers at the Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) have used one of the most advanced microscopes in the world to reveal the structure of a large protein complex crucial to photosynthesis, the process by which plants convert sunlight into cellular energy.

The finding, published in the journal Nature, will allow scientists to explore for the first time how the complex functions and could have implications for the production of a variety of bioproducts, including plastic alternatives and biofuels.

"This work will lead to a better understanding of how photosynthesis occurs, which could allow us to improve the efficiency of photosynthesis in plants and other green organisms - potentially boosting the amount of food, and thus biomass, they produce," said lead researcher Karen Davies, a biophysicist at Berkeley Lab. "This is particularly important if you want to produce renewable bioproducts that are cost-effective alternatives to current petroleum-based products."

Discovered decades ago, the protein complex targeted by the researchers, called NADH dehydrogenase-like complex (NDH), is known to help regulate the phase of photosynthesis where the energy of sunlight is captured and stored in two types of cellular energy molecules, which are later utilized to power the conversion of carbon dioxide into sugar. Past investigations revealed that NDH reshuffles the energized electrons moving among other protein complexes in the chloroplast in a way that ensures the correct ratio of each energy molecule is produced. Furthermore, NDH of cyanobacteria performs several additional roles including increasing the amount of carbon dioxide (CO2) available for sugar production by linking CO2 uptake with electron transfer.

In order for scientists to truly comprehend how NDH executes these important functions, they needed a molecular blueprint indicating the location and connectivity of all the atoms in the complex. This is something that even highly powerful transmission electron microscopy (TEM) technology simply could not provide until very recently.

"Research on this enzyme has been difficult and experimental results confounding for the last 20 years or so because we have lacked complete information about the enzyme's structure," said Davies. "Knowing the structure is important for generating and testing out hypotheses of how the enzyme functions. The resolution we obtained for our structure of NDH has only really been achievable since the commercialization of the direct electron counting camera, developed in collaboration with Berkeley Lab."

Prior to this invention, explained Davies, a staff scientist in Berkeley Lab's Molecular Biophysics and Integrative Bioimaging Division (MBIB), determining the structure of a single molecule could take several years because cryo-TEM imaging relied on film, meaning that each exposure had to be developed and scanned before it could be analyzed. The main limitation, however, was that most images turned out blurry. When a beam of electrons is directed at a molecule, the charged, high-energy particles excite the atoms in the molecule, often making them move at the moment of exposure. This meant that researchers needed to take and process hundreds, if not thousands, of film images in order to get an accurate glimpse of an entire molecule.

The new electron counting camera solves this problem by taking digital movies with an extremely high frame rate, so individual frames can be aligned to eliminate blurring caused by beam-induced particle motion.
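To illustrate the alignment idea described above, the sketch below registers movie frames to the first frame by phase correlation and then averages them. It is an assumption-laden toy (integer pixel shifts only, plain NumPy), not the detector vendor's or Berkeley Lab's motion-correction software.

```python
# Toy motion correction: estimate each frame's shift relative to the first
# frame via phase correlation, undo it, then average the aligned frames.
import numpy as np

def shift_between(ref, frame):
    """Estimate the integer (dy, dx) translation of `frame` relative to `ref`."""
    cross_power = np.conj(np.fft.fft2(ref)) * np.fft.fft2(frame)
    cross_power /= np.abs(cross_power) + 1e-12           # keep only the phase
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    if dy > ref.shape[0] // 2: dy -= ref.shape[0]         # wrap to signed shifts
    if dx > ref.shape[1] // 2: dx -= ref.shape[1]
    return dy, dx

def align_and_average(frames):
    """Shift every frame back onto the first one and average to reduce blur."""
    ref = frames[0]
    aligned = [ref]
    for f in frames[1:]:
        dy, dx = shift_between(ref, f)
        aligned.append(np.roll(f, (-dy, -dx), axis=(0, 1)))
    return np.mean(aligned, axis=0)
```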

In the current study, first author Thomas Laughlin, a UC Berkeley graduate student with a joint appointment at MBIB, isolated NDH complexes from membranes of a photosynthetic cyanobacterium provided by the Junko Yano and Vittal Yachandra Lab in MBIB and imaged them using a state-of-the-art cryo-TEM instrument fitted with the latest direct electron detector. Located on the UC Berkeley campus, the cryo-TEM facility is managed by the Bay Area CryoEM consortium, which is partly funded by Berkeley Lab.

The resulting atom density map was then used to build a model of NDH that shows the arrangement of all the protein subunits of NDH and the most likely position of all the atoms in the complex. By examining this model, Davies' team will be able to formulate and then test hypotheses of how NDH facilitates sugar production by balancing the ratio of the two cellular energy molecules.

"While the structure of NDH alone certainly addresses many questions, I think it has raised several more that we had not even thought to consider before," said Laughlin.

Among the many Berkeley Lab scientists focused on advancing knowledge of fundamental biochemical and biophysical processes, Davies and her staff also use direct electron camera cryo-EM to investigate how variations in the organization of photosynthetic complexes, caused by changes in growth and light conditions, affect the efficiency of photosynthesis. Her project on electron flow in photosynthesis is supported by a five-year DOE Office of Science Early Career Research Program grant that was awarded in 2018.

Credit: 
DOE/Lawrence Berkeley National Laboratory

Novel software offers possible reduction in arrhythmic heart disease

Potentially lethal heart conditions may become easier to spot and may lead to improvements in prevention and treatment thanks to innovative new software that measures electrical activity in the organ.

The heart's pumping ability is controlled by electrical activity that triggers the heart muscle cells to contract and relax. In certain heart diseases such as arrhythmia, the organ's electrical activity is affected.

Cardiac researchers can already record and analyse the heart's electrical behaviour using optical and electrode mapping, but widespread use of these technologies is limited by a lack of appropriate software.

Computer and cardiovascular experts at the University of Birmingham have worked with counterparts in the UK, the Netherlands and Australia to develop ElectroMap, a new open-source software tool for processing, analysing and mapping complex cardiac data.

Led by researchers from the School of Computer Science and the Institute of Cardiovascular Sciences, at the University of Birmingham, the international team has published its findings in Scientific Reports.

Dr Kashif Rajpoot, Senior Lecturer and Programme Director for Computer Science at the University of Birmingham Dubai, commented: "We believe that ElectroMap will accelerate innovative cardiac research and lead to wider use of mapping technologies that help to prevent the incidence of arrhythmia.

"This is a robustly validated open-source flexible tool for processing and by using novel data analysis strategies we have developed, this software will provide a deeper understanding of heart diseases, particularly the mechanisms underpinning potentially lethal arrhythmia."

The incidence and prevalence of cardiac disease continues to increase every year, but improvements in prevention and treatment require better understanding of electrical behaviour across the heart.

Data on this behaviour can be gathered using electrocardiogram tests, but more recently, optical mapping has allowed wider measurement of cardiovascular activity in greater detail. Insights from optical mapping experiments have given researchers a better understanding of complex arrhythmias and electrical behaviour in heart disease.

"Increased availability of optical mapping hardware in the laboratory has led to expansion of this technology, but further uptake and wider application is hindered by limitations with respect to data processing and analysis," said Dr Davor Pavlovic - lead contributor from the University of Birmingham's Institute of Cardiovascular Sciences. "The new software can detect, map and analyse arrhythmic phenomena for in silico, in cellulo, animal model and in vivo patient data."

Credit: 
University of Birmingham

Researchers create ultra-lightweight ceramic material that withstands extreme temperatures

image: Breath mint-sized samples of the ceramic aerogels developed by a UCLA-led research team. The material is 99 percent air by volume, making it super lightweight.

Image: 
UCLA Samueli Engineering

UCLA researchers and collaborators at eight other research institutions have created an extremely light, very durable ceramic aerogel. The material could be used for applications like insulating spacecraft because it can withstand the intense heat and severe temperature changes that space missions endure.

Ceramic aerogels have been used to insulate industrial equipment since the 1990s, and they have been used to insulate scientific equipment on NASA's Mars rover missions. But the new version is much more durable after exposure to extreme heat and repeated temperature spikes, and much lighter. Its unique atomic composition and microscopic structure also make it unusually elastic.

When it's heated, the material contracts rather than expanding like other ceramics do. It also contracts perpendicularly to the direction that it's compressed -- imagine pressing a tennis ball on a table and having the center of the ball move inward rather than expanding out -- the opposite of how most materials react when compressed. As a result, the material is far more flexible and less brittle than current state-of-the-art ceramic aerogels: It can be compressed to 5 percent of its original volume and fully recover, while other existing aerogels can be compressed to only about 20 percent and then fully recover.

The research, which was published today in Science, was led by Xiangfeng Duan, a UCLA professor of chemistry and biochemistry; Yu Huang, a UCLA professor of materials science and engineering; and Hui Li of Harbin Institute of Technology, China. The study's first authors are Xiang Xu, a visiting postdoctoral fellow in chemistry at UCLA from Harbin Institute of Technology; Qiangqiang Zhang of Lanzhou University; and Menglong Hao of UC Berkeley and Southeast University.

Other members of the research team were from UC Berkeley; Purdue University; Lawrence Berkeley National Laboratory; Hunan University, China; Lanzhou University, China; and King Saud University, Saudi Arabia.

Despite the fact that more than 99 percent of their volume is air, aerogels are solid and structurally very strong for their weight. They can be made from many types of materials, including ceramics, carbon or metal oxides. Compared with other insulators, ceramic-based aerogels are superior in blocking extreme temperatures, and they have ultralow density and are highly resistant to fire and corrosion -- all qualities that lend themselves well to reusable spacecraft.

But current ceramic aerogels are highly brittle and tend to fracture after repeated exposure to extreme heat and dramatic temperature swings, both of which are common in space travel.

The new material is made of thin layers of boron nitride, a ceramic, with atoms that are connected in hexagon patterns, like chicken wire.

In the UCLA-led research, it withstood conditions that would typically fracture other aerogels. It stood up to hundreds of exposures to sudden and extreme temperature spikes when the engineers raised and lowered the temperature in a testing container between minus 198 degrees Celsius and 900 degrees above zero over just a few seconds. In another test, it lost less than 1 percent of its mechanical strength after being stored for one week at 1,400 degrees Celsius.

"The key to the durability of our new ceramic aerogel is its unique architecture," Duan said. "Its innate flexibility helps it take the pounding from extreme heat and temperature shocks that would cause other ceramic aerogels to fail."

Ordinary ceramic materials usually expand when heated and contract when they are cooled. Over time, those repeated temperature changes can lead those materials to fracture and ultimately fail. The new aerogel was designed to be more durable by doing just the opposite -- it contracts rather than expanding when heated.

In addition, the aerogel's ability to contract perpendicularly to the direction that it's being compressed -- like the tennis ball example -- helps it survive repeated and rapid temperature changes. (That property is known as a negative Poisson's ratio.) It also has interior "walls" that are reinforced with a double-pane structure, which cuts down the material's weight while increasing its insulating abilities.
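For reference, Poisson's ratio compares how much a material shrinks or bulges sideways to how much it is squeezed along the loading direction; the defining relation below is the standard one, though the specific value for this aerogel is not quoted in the article:

```latex
\nu = -\frac{\varepsilon_{\mathrm{transverse}}}{\varepsilon_{\mathrm{axial}}}
```

Most everyday materials have a Poisson's ratio between about 0 and 0.5 and bulge outward when compressed; a negative value means the material also pulls inward sideways under compression, the tennis-ball behavior described above.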

Duan said the process researchers developed to make the new aerogel also could be adapted to make other ultra-lightweight materials.

"Those materials could be useful for thermal insulation in spacecraft, automobiles or other specialized equipment," he said. "They could also be useful for thermal energy storage, catalysis or filtration."

Credit: 
University of California - Los Angeles

Merging neutron stars

image: Simulation of merging neutron stars calculated with supercomputers. Different colors show the mass density and the temperature some time after the merger has taken place and shortly before the object collapses to a black hole. Quarks are expected to form where the temperature and density are higher.

Image: 
Copyright: C. Breu, L. Rezzolla

The ability to measure the gravitational waves of two merging neutron stars has offered the chance to answer some of the fundamental questions about the structure of matter. At the extremely high temperatures and densities reached in such a merger, scientists conjecture a phase transition in which neutrons dissolve into their constituents: quarks and gluons. In the current issue of Physical Review Letters, two international research groups report their calculations of what the signature of such a phase transition in a gravitational wave would look like.

Quarks, the smallest building blocks of matter, never appear alone in nature. They are always tightly bound inside protons and neutrons. However, neutron stars, weighing as much as the Sun but measuring only about the size of a city like Frankfurt, possess cores so dense that a transition from neutron matter to quark matter may occur. Physicists refer to this process as a phase transition, similar to the liquid-vapor transition in water. In particular, such a phase transition is in principle possible when merging neutron stars form a very massive, meta-stable object with densities exceeding those of atomic nuclei and temperatures 10,000 times higher than in the Sun's core.

The measurement of gravitational waves emitted by merging neutron stars could serve as a messenger of possible phase transitions in outer space. The phase transition should leave a characteristic signature in the gravitational-wave signal. The research groups from Frankfurt, Darmstadt and Ohio (Goethe University/FIAS/GSI/Kent University) as well as from Darmstadt and Wroclaw (GSI/Wroclaw University) used modern supercomputers to calculate what this signature could look like. For this purpose, they used different theoretical models of the phase transition.

If the phase transition takes place somewhat later, after the actual merger, small amounts of quarks will gradually appear throughout the merged object. "With the aid of the Einstein equations, we were able to show for the first time that this subtle change in the structure will produce a deviation in the gravitational-wave signal until the newly formed massive neutron star collapses under its own weight to form a black hole," explains Luciano Rezzolla, a professor of theoretical astrophysics at Goethe University.

In the computer models of Dr. Andreas Bauswein from GSI Helmholtzzentrum für Schwerionenforschung in Darmstadt, a phase transition already happens directly after the merger -- a core of quark matter forms in the interior of the central object. "We succeeded in showing that in this case there will be a distinct shift in the frequency of the gravitational-wave signal," says Bauswein. "Thus, we identified a measurable criterion for a phase transition in gravitational waves from neutron star mergers in the future."

Not all of the details of the gravitational-wave signal are measurable with current detectors yet. However, they will become observable both with the next generation of detectors and with a merger event relatively close to us. A complementary approach to answering the questions about quark matter is offered by two experiments: by colliding heavy ions at the existing HADES setup at GSI and at the future CBM detector at the Facility for Antiproton and Ion Research (FAIR), currently under construction at GSI, compressed nuclear matter will be produced. In these collisions, it might be possible to create temperatures and densities similar to those in a neutron-star merger. Both methods give new insights into the occurrence of phase transitions in nuclear matter and thus into its fundamental properties.

Credit: 
Helmholtz Association

Improved RNA data visualization method gets to the bigger picture faster

Like going from a pinhole camera to a Polaroid, a significant mathematical update to the formula for a popular bioinformatics data visualization method will allow researchers to develop snapshots of single-cell gene expression not only several times faster but also at much higher-resolution. Published in Nature Methods, this innovation by Yale mathematicians will reduce the rendering time of a million-point single-cell RNA-sequencing (scRNA-seq) data set from over three hours down to just fifteen minutes.

Scientists say the existing decade-old method, t-distributed Stochastic Neighbor Embedding (t-SNE), is great for representing patterns in RNA sequencing data gathered at the single-cell level (scRNA-seq data) in two dimensions. "In this setting, t-SNE 'organizes' the cells by the genes they express and has been used to discover new cell types and cell states," said George Linderman, lead author and a Yale M.D.-Ph.D. student specializing in applied mathematics.

By computational standards, t-SNE is quite slow. Thus, researchers often "downsample" their scRNA-seq dataset -- take a smaller sample from the initial sample -- before applying t-SNE. However, downsampling is a poor compromise, as it makes it unlikely for t-SNE to capture rare cell populations, which are often what researchers most want to identify.

More than 30 years ago, another team of Yale mathematicians developed the fast multipole method (FMM), a revolutionary numerical technique that sped up the calculation of long-range forces in the n-body problem. The researchers on this study recognized that the principles behind the FMM could also be applied to nonlinear dimensionality reduction problems such as t-SNE, and they accelerated t-SNE until it earned its new name: FIt-SNE, or fast interpolation-based t-SNE.
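For readers who want to try this kind of embedding, the sketch below shows the standard scRNA-seq workflow (PCA, then a 2-D t-SNE). It uses scikit-learn's Barnes-Hut t-SNE as a stand-in because that API is widely available; the authors' FIt-SNE instead replaces the gradient computation with FFT-accelerated interpolation, which is what makes million-cell datasets feasible without downsampling. The data here are random placeholders, not real measurements.

```python
# Minimal t-SNE workflow sketch; scikit-learn's TSNE stands in for FIt-SNE,
# and the expression matrix is a random placeholder, not real scRNA-seq data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import TSNE

X = np.random.rand(5000, 2000)                  # rows = cells, columns = genes (hypothetical)
X_pca = PCA(n_components=50).fit_transform(X)   # usual preprocessing step

embedding = TSNE(n_components=2, perplexity=30, init="pca",
                 random_state=0).fit_transform(X_pca)
print(embedding.shape)                          # (5000, 2): one 2-D point per cell
```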

"Using our approach, researchers can not only analyze single cell RNA-sequencing data faster, but it also can be used to characterize rare cell subpopulations that cannot be detected if the data is subsampled prior to t-SNE," said Yuval Kluger, senior author and Yale professor of pathology. Additionally, the team used a heatmap-style visualization for its FIt-SNE results, which makes it easy for researchers to see the expression patterns of thousands of genes at the level of single cells simultaneously.

The researchers said 2019 couldn't be a better new year for t-SNE to get "FIt." In December 2018, Science Magazine named tracking development of embryos cell by cell -- impossible to accomplish without visualizations based on scRNA-seq data -- the Breakthrough of the Year. FIt-SNE will speed up further work in this field of developmental biology as well as in fields such as neuroscience and cancer research, where single-cell sequencing has become an invaluable tool for mapping the brain and understanding tumors, said the researchers.

Credit: 
Yale University

Men's porn habits could fuel partners' eating disorders

image: Tracy Tylka's research found that women whose partners frequently watched porn were more likely to report symptoms of eating disorders. Symptoms were also more common for those who felt pressure from their partner to be thin.

Image: 
The Ohio State University

COLUMBUS, Ohio - A woman whose boyfriend or husband regularly watches pornography is more likely to report symptoms of an eating disorder, new research suggests.

The study is one of the first to look at how a romantic partner's behavior might be linked to the likelihood of a woman experiencing or engaging in such things as extreme guilt about eating, preoccupation with body fat, binging or purging.

In addition to finding an association between a partner's porn habits and eating disorder symptoms, the research also found a higher incidence of those symptoms in women who said they feel pressure from their boyfriends or husbands to be thin.

The study, led by researchers at The Ohio State University, appears in the International Journal of Eating Disorders.

"We often talk about the influences of media, family and friends on eating disorders, but little has been done to determine how a partner's influence might contribute to a woman's disordered eating," said Tracy Tylka, a professor of psychology at Ohio State's Columbus and Marion campuses. "It's a gap in the research and if certain partner variables are risk factors we should be giving them more attention."

The study is also the first research of its kind to address these partner influences in women who are older and more likely to be in long-term relationships.

"The women who were part of this study had an average age of almost 34, and were from a broader demographic than the stereotypical white adolescent girl with anorexia," Tylka said.

"Disordered eating affects many people who do not fit this description - as many as 20 to 25 percent of women - and this study helps us better understand the influences on these women."

The participants, 409 U.S. women in relationships with men, answered a questionnaire designed to identify symptoms of eating disorders and answered questions about perceived pressure from the media and others (partners, friends and family) in their lives to lose weight and have a thin body. They also reported how many hours of pornography their current partner viewed per week, ranging from none to more than eight hours, and estimated how often their previous partners had viewed pornography on a scale ranging from never to almost always.

The researchers then analyzed the relationships between those responses and found a clear association between eating disorder symptoms and both perceived partner pressure to be thin and pornography use.

"In many categories of eating disorder symptoms, perceived pressure from a romantic partner to be thin appeared to be more detrimental than pressure from friends or family, or even the media," Tylka said.

And both partner pornography viewing and pressure to be thin appeared to be associated with a woman's disordered eating behavior even if she didn't idealize thinness, according to the study.

That's important to note, Tylka said, because women may be responding solely to what they think their partner values, even if they don't value that "thin body ideal" for themselves.

Tylka said she was interested in the potential relationship between partner pornography use and eating disorders because it could prompt women to feel pressured to aspire to unrealistic body types, or to "feel sexless because their partners are spending time with porn instead of connecting with them."

"The relationship between partner pornography use and disordered eating was stronger for this group of women than for college women we've previously studied. That could be because these women have had more relationship experiences, and these experiences have shaped their relationships with food and their perceptions of their bodies," Tylka said.

The study did not examine potential differences between women who watch pornography with their partners and those whose partners view pornography alone.

Tylka said further study is warranted in the area of partner influences on disordered eating among older women. Understanding these factors could help improve eating disorder prevention and treatment, she said.

"Some professionals are already advocating for integrating partners in eating disorder prevention and treatment, and these findings support this argument."

Credit: 
Ohio State University

Bigger teams aren't always better in science and tech

image: A forest of trees where each tree is a project and a person supports the kind of deep searching (roots) and highly disruptive (branches) work produced by small teams.

Image: 
Data Illustration by Lingfei Wu/University of Chicago Knowledge Lab

In today's science and business worlds, it's increasingly common to hear that solving big problems requires a big team. But a new analysis of more than 65 million papers, patents and software projects found that smaller teams produce much more disruptive and innovative research.

In a new paper published by Nature, University of Chicago researchers examined 60 years of publications and found that smaller teams were far more likely to introduce new ideas to science and technology, while larger teams more often developed and consolidated existing knowledge.

While both large and small teams are essential for scientific progress, the findings suggest that recent trends in research policy and funding toward big teams should be reassessed.

"Big teams are almost always more conservative. The work they produce is like blockbuster sequels; very reactive and low-risk." said study co-author James Evans, professor of sociology, director of the Knowledge Lab at UChicago and a leading scholar in the quantitative study of how ideas and technologies emerge. "Bigger teams are always searching the immediate past, always building on yesterday's hits. Whereas the small teams, they do weird stuff--they're reaching further into the past, and it takes longer for others to understand and appreciate the potential of what they are doing."

Knowledge Lab is a unique research center that combines "science of science" approaches from sociology with the explosion of digital information now available on the history of research and discovery. By using advanced computational techniques and developing new tools, Knowledge Lab researchers reconstruct and examine how knowledge over time grows and influences our world, generating insights that can fuel future innovation.

The Nature study collected 44 million articles and more than 600 million citations from the Web of Science database, 5 million patents from the U.S. Patent and Trademark Office, and 16 million software projects from the GitHub platform. Each individual work in this massive dataset was then computationally assessed for how much it disrupted versus developed its field of science or technology.

"Intuitively, a disruptive paper is like the moon during the lunar eclipse; it overshadows the sun -- the idea it builds upon -- and redirects all future attention to itself," said study co-author Lingfei Wu, a postdoctoral researcher with the University of Chicago and Knowledge Lab. "The fact that most of the future works only cite the focal paper and not its references is evidence for the 'novelty' of the focal paper. Therefore, we can use this measure, originally proposed by Funk and Owen-Smith, as a proxy for the creation of new directions in the history of science and technology."

Across papers, patents and software products, disruption dramatically declined with the addition of each additional team member. The same relationship appeared when the authors controlled for publication year, topic or author, or tested subsets of data, such as Nobel Prize-winning articles. Even review articles, which simply aggregate the findings of previous publications, are more disruptive when authored by fewer individuals, the study found.

The main driver of the difference in disruption between large and small teams appeared to be how each treats the history of its field. Larger teams were more likely to cite more recent, highly cited research in their work, building upon past successes and acknowledging problems already in their field's zeitgeist. By contrast, smaller teams more often cited older, less popular ideas, a deeper and wider information search that creates new directions in science and technology.

"Small teams and large teams are different in nature," Wu said. "Small teams remember forgotten ideas, ask questions and create new directions, whereas large teams chase hotspots and forget less popular ideas, answer questions and stabilize established paradigms."

The analysis shows that both small and large teams play important roles in the research ecosystem, with the former generating new, promising insights that are rapidly developed and refined by the latter. Some questions, like those tackled by the Large Hadron Collider or the search for dark energy, are so expensive to pursue that they can only be answered by a single, massive collaboration. But other complex scientific questions may be more effectively pursued by an ensemble of independent, risk-taking small teams rather than a large consortium, the authors argue.

"In the context of science, funders around the world are funding bigger and bigger teams," Evans said. "What our research proposes is that you really want to fund a greater diversity of approaches. It suggests that if you really want to build science and technology, you need to act like a venture capitalist rather than a big bank -- you want to fund a bunch of smaller and largely disconnected efforts to improve the likelihood of major, pathbreaking success."

"Most things are going to fail, or are not going to push the needle within a field. As a result it's really about optimizing failure," Evans added. "If you want to do discovery, you have to gamble."

Credit: 
University of Chicago

Shedding light on the pathway to put the traumatic past behind

video: In the conventional fear reduction procedure (top), the CS (conditioned stimulus) group is subjected to the anxiety-producing sound without electric shocks. Since a freezing response decreases at a slow rate, mice exhibited extreme freezing responses at both the first (a) and second (b) auditory stimulations. In the ABS pairing (bottom), a moving LED light was presented with the second auditory stimulation. While an extreme freezing response was also seen in mice at the first auditory stimulation (c), the pairing of auditory stimulation with moving light (d) induced a significant decrease in the mice's freezing response.

Image: 
IBS

Suppose you are visually tracking a moving light swinging side to side. Your attention is naturally drawn to that movement, and whatever was on your mind before is set aside. This alternating bilateral sensory stimulation (ABS), a component of eye movement desensitization and reprocessing (EMDR), is thought to support the neural integration of new perspectives and the healing of negatively charged memories. Though the treatment has been recognized for its long-lasting effects, its underlying neural basis has remained unclear. Because of this lack of a scientific explanation, many psychiatrists shun the therapy, even though it is listed in many psychotherapy manuals. Researchers from the Center for Cognition and Sociality within the Institute for Basic Science (IBS) have now identified the brain pathway through which ABS induces a persistent reduction in fear.

In EMDR, patients are instructed to recall a traumatic memory while receiving ABS. Given that this combined visual-attentional process is commonly used with post-traumatic stress disorder (PTSD) patients, the researchers hypothesized that a brain region responsible for eye movement and attention, the superior colliculus (SC), may be involved in the fear-reducing effect of ABS.

The researchers first examined whether ABS-paired treatment prevents the return of fear. To form a fear memory in mice, they subjected the animals to a sound while delivering a mild foot shock, training them to associate the sound with a painful experience. Fear in the mice was evident as freezing in place. The mice were then repeatedly exposed to the anxiety-producing sound, now without the electric shocks, until they no longer found the sound stressful, a procedure known as fear extinction. In humans, this kind of conventional exposure therapy is often followed by a severe relapse of symptoms. To test the effects of visual stimulation on fear responses in mice, the researchers placed the animals in a cylindrical container with LEDs installed on the wall.

The conventional fear reduction procedure eased fear responses (freezing) with repeated exposure to the sound in the location where the foot shocks had previously been delivered. However, fear responses often returned when the sound was presented one week later in the same location or in a new one. In contrast, pairing the visual stimulation (the moving light) with the sound, the ABS pairing, produced a persistent fear reduction without significant relapse, confirming the pairing's powerful fear-reducing effect.

The researchers found enhanced neuronal activity in the SC and in the mediodorsal thalamic nucleus (MD), which receives input from the SC. They wondered whether this SC-MD pathway might be the route through which the ABS signal travels to reduce fear. "To confirm this causal link, we blocked the SC-MD route during ABS pairing by using a yellow laser light," says Jinhee Baek, one of the first authors of the study. This manipulation blocked the effect of ABS and brought a significant return of fear. Conversely, when blue laser light stimulated neuronal activity in the SC-MD pathway, mice showed significantly reduced freezing without fear relapse. Through these experiments, the researchers established that the SC-MD pathway is essential for preventing the return of fear.

Mr. Baek says, "Then we wondered which mechanism suppresses the expression of fear." The researchers looked into the basolateral amygdala (BLA), a brain area that controls fear expression and stores fear memory. ABS pairing inhibited fear-expressing neurons in the BLA. When the researchers blocked the MD-BLA pathway with a yellow laser light, the blockage induced excitatory activity and delayed inhibitory responses in the BLA. "This study discovered a novel function of MD neurons in suppressing BLA fear responses," explains Mr. Baek.

Studies using animal models have focused on direct approaches, removing the original fear memory with chemicals that impair synapses or neurons, which makes these approaches inappropriate for clinical applications. Current psychotherapeutic methods, meanwhile, have been used in humans without a clear understanding of how they reduce traumatic symptoms. This study uncovered the neuronal circuits underlying one such psychotherapeutic method. The superior colliculus, previously known only for its role in eye movement and attention, had not been considered a modulator of learned fear responses. Notably, the neuronal pathway reported here induces a more stable inhibition of fear, without significant return of fear responses. "By shedding light on the brain circuits underlying ABS pairing's powerful fear-reducing effects, this study can come as a powerful reassurance to PTSD patients," says Dr. Hee-Sup Shin, one of the corresponding authors of the study.

Credit: 
Institute for Basic Science

VUMC researchers, supercomputing effort reveal antibody secrets

image: James Crowe, Jr., M.D., director of the Vanderbilt Vaccine Center.

Image: 
Vanderbilt University Medical Center

Using sophisticated gene sequencing and computing techniques, researchers at Vanderbilt University Medical Center (VUMC) and the San Diego Supercomputer Center have achieved a first-of-its-kind glimpse into how the body's immune system gears up to fight off infection.

Their findings, published this week in the journal Nature, could aid development of "rational vaccine design," as well as improve detection, treatment and prevention of autoimmune diseases, infectious diseases, and cancer.

"Due to recent technological advances, we now have an unprecedented opportunity to harness the power of the human immune system to fundamentally transform human health," Wayne Koff, PhD, CEO of the Human Vaccines Project, which led the research effort, said in a news release.

The study focused on antibody-producing white blood cells called B cells. These cells bear Y-shaped receptors that, like microscopic antennae, can detect an enormous range of germs and other foreign invaders.

They do this by randomly selecting and joining together unique sequences of nucleotides (DNA building blocks) known as receptor "clonotypes." In this way, a small number of genes can produce an incredible diversity of receptors, allowing the immune system to recognize almost any new pathogen.
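
As a rough illustration of that combinatorial arithmetic, the toy sketch below joins a handful of hypothetical gene segments into receptor "clonotypes." The segment names and counts are invented for this example, and real receptor assembly adds junctional diversity that multiplies the numbers far beyond what simple combination gives.

# Toy illustration of how joining a few gene segments yields many distinct
# receptor "clonotypes". Segment names and counts are made up for
# illustration; real receptor assembly adds junctional diversity that
# multiplies these numbers much further.
import random

v_segments = [f"V{i}" for i in range(1, 51)]   # 50 hypothetical V segments
d_segments = [f"D{i}" for i in range(1, 26)]   # 25 hypothetical D segments
j_segments = [f"J{i}" for i in range(1, 7)]    # 6 hypothetical J segments

# Every ordered V-D-J combination is a distinct candidate clonotype.
n_combinations = len(v_segments) * len(d_segments) * len(j_segments)
print(n_combinations)  # 7500 combinations from just 81 segments

# Sample one random clonotype, as a single cell might during receptor assembly.
clonotype = "-".join(random.choice(s) for s in (v_segments, d_segments, j_segments))
print(clonotype)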

Understanding exactly how this process works has been daunting. "Prior to the current era, people assumed it would be impossible to do such a project because the immune system is theoretically so large," said James Crowe Jr., MD, director of the Vanderbilt Vaccine Center and the paper's senior author.

"This new paper shows it is possible to define a large portion," Crowe said, "because the size of each person's B cell receptor repertoire is unexpectedly small."

The researchers isolated white blood cells from three adults, and then cloned and sequenced up to 40 billion B cells to determine their clonotypes. They also sequenced the B-cell receptors from umbilical cord blood from three infants. This depth of sequencing had never been achieved before.

What they found was a surprisingly high frequency of shared clonotypes. "The overlap in antibody sequences between individuals was unexpectedly high," Crowe explained, "even showing some identical antibody sequences between adults and babies at the time of birth."
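
One simple way to picture such sharing, offered purely as an illustration and not as the study's actual analysis, is to treat each person's repertoire as a set of clonotype sequences and measure the intersection; the sequences below are placeholders, not real antibody data.

# Minimal sketch of quantifying clonotype sharing between two repertoires,
# using a simple Jaccard-style overlap. The sequences are placeholders and
# the study's actual overlap analysis is more involved.

adult_repertoire = {"CARDYW", "CAKGGF", "CARSSY", "CTRDPF"}
infant_repertoire = {"CARDYW", "CARSSY", "CVRQLY"}

shared = adult_repertoire & infant_repertoire
jaccard = len(shared) / len(adult_repertoire | infant_repertoire)

print(sorted(shared))      # clonotypes found in both individuals
print(round(jaccard, 3))   # fraction of the combined repertoire that is shared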

Understanding this commonality is key to identifying antibodies that can be targets for vaccines and treatments that work more universally across populations.

The Human Vaccines Project is a nonprofit public-private partnership of academic research centers, industry, nonprofits and government agencies focused on research to advance next-generation vaccines and immunotherapies. This study was part of its Human Immunome Program, which aims to decode the genetic underpinnings of the immune system.

As part of a unique consortium created by the Human Vaccines Project, the San Diego Supercomputer Center applied its considerable computing power to analyzing the multiple terabytes of data. A central tenet of the Project is the merger of biomedicine and advanced computing.

"The Human Vaccines Project allows us to study problems at a larger scale than would be normally possible in a single lab and it also brings together groups that might not normally collaborate," said Robert Sinkovits, PhD, who leads scientific applications efforts at the San Diego Supercomputer Center.

Collaborative work is now underway to expand the study to other areas of the immune system and to B cells from older people and from diverse parts of the world, and to apply artificial intelligence-driven algorithms to further mine the datasets for insights.

The researchers hope that continued interrogation of the immune system will ultimately lead to the development of safer and highly targeted vaccines and immunotherapies that work across populations.

"Decoding the human immune system is central to tackling the global challenges of infectious and non-communicable diseases, from cancer to Alzheimer's to pandemic influenza," Koff said. "This study marks a key step toward understanding how the human immune system works, setting the stage for developing next-generation health products through the convergence of genomics and immune monitoring technologies with machine learning and artificial intelligence."

Credit: 
Vanderbilt University Medical Center