Culture

Why a vacation seems like it will end as soon as it begins

COLUMBUS, Ohio - Time not only flies when you're having fun - sometimes anticipating a fun event makes it feel like it will be over as soon as it begins, a new study suggests.

Researchers found that people judge future positive events as both farther away and shorter in duration than negative or neutral events.

Combining those two elements has a strange effect when people look forward to a positive event like a vacation, said Selin Malkoc, co-author of the study and associate professor of marketing at The Ohio State University's Fisher College of Business.

"The seemingly endless wait for the vacation to start combined with the feeling that the vacation will fly by leads people to feel like the beginning and the end of their time off are similarly far from the present," Malkoc said.

"In other words, in their mind's eye, the vacation is over as soon as it begins. It has no duration."

The study was published online recently in the Journal of Consumer Psychology.

This phenomenon has another interesting effect: It makes people feel like the endpoints of positive and negative events are similarly distant from the present.

That's because anticipating a negative event - like a dreaded work trip - reverses the effects of a positive event: People feel like the negative event is right around the corner and will last a long time.

"Thinking about future positive and negative events leads people to take two different paths to the same conclusion, with the ends of both events seeming similarly far away," said study co-author Gabriela Tonietto, assistant professor of marketing at Rutgers Business School - Newark and New Brunswick.

The Journal of Consumer Psychology paper included four related studies that came to similar conclusions. In one study, 451 online participants considered the upcoming weekend, which was either expected to be fun, terrible, or just OK.

They then indicated how far away the beginning and then the end of the weekend felt on a 0-100 slider scale (0 = very near, 100 = very far).

Findings showed that a good weekend seemed farther away and shorter, while a terrible weekend seemed closer to the present day and longer in duration. An OK weekend fell in between.

On the slider scale, people rated a bad weekend as ending significantly farther away than its beginning. But for people who expected a good weekend, the slider scale ratings for how far away the beginning and the end seemed to them were nearly identical.

In fact, 46% of participants evaluated the positive weekend as feeling like it had no duration at all as they thought about both the event and the time leading up to it.
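The slider measure makes this effect easy to see: an event's perceived duration is simply how much farther away its end feels than its beginning. A minimal sketch with hypothetical ratings (illustrative numbers, not data from the study):

```python
# Minimal sketch with hypothetical slider ratings (0 = very near,
# 100 = very far); these numbers are illustrative, not study data.

def perceived_duration(begin_rating: float, end_rating: float) -> float:
    # Perceived duration = how much farther the end feels than the beginning.
    return end_rating - begin_rating

# A dreaded weekend: it begins soon (20) but its end feels distant (55),
# so it feels long.
bad = perceived_duration(20, 55)

# A fun weekend: both endpoints feel similarly far away, so in the
# mind's eye it has almost no duration.
good = perceived_duration(62, 63)
```

Under this framing, the 46% of participants above effectively rated the beginning and end of a positive weekend at the same slider position, giving a perceived duration of zero.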

Thinking about how far the beginning and the end of the event are from the present is key to this phenomenon, Malkoc said. Another study showed that when people were asked to directly indicate how long they expected a positive event to last, they thought it would go quickly, but they did indicate it would take up some time.

It was only when people also considered the time leading up to the fun event - which they expected to crawl - that they thought a future positive experience would feel like it had no duration.

These findings have some interesting implications as people start planning vacations and other fun events as the COVID-19 pandemic ends, Malkoc said.

"If a vacation seems like it is going to end as soon as it begins, it may make people less likely to plan specific events during their time off," she said.

"It may also lead people to spend more on hotels and other luxuries, since it seems like the vacation is such a short time anyway."

Credit: 
Ohio State University

New paper establishes context for myopia control axial length targets

image: Paul Chamberlain, BSc (Hons), MCOptom, is the paper's lead author and Director of Research Programs for CooperVision.

Image: 
CooperVision

SAN RAMON, Calif., May 26, 2021--A new paper that has been accepted for publication in Ophthalmic & Physiological Optics, the peer-reviewed journal of The College of Optometrists (UK), furthers understanding of myopia control efficacy in the context of normal childhood eye growth. Axial Length Targets for Myopia Control (Chamberlain P, et al.) is now available online via Open Access.

Since young eyes grow, even when not myopic, the paper compares axial elongation among children who remain emmetropic, children with untreated myopia progression, and children with myopia managed with CooperVision MiSight® 1 day myopia control contact lenses. The comparison uses data from the three-year MiSight® 1 day clinical study, the Orinda Longitudinal Study of Myopia (OLSM)[1] and the Singapore Cohort Study of the Risk Factors for Myopia (SCORM).[2]

The analysis revealed that the predicted three-year axial elongation for emmetropic children (0.24 mm) is similar to the mean three-year elongation in MiSight® 1 day-treated children with myopia (0.30 mm).[3,4] In contrast, the axial elongation (0.63 mm) observed in the control group in the MiSight® 1 day clinical study was much higher, and similar to virtual cohorts based on the OLSM (0.70 mm) and SCORM (0.65 mm) models for myopia development.[3,4]

This observation suggests that while abnormal myopic eye growth may be managed with MiSight® 1 day in age-appropriate* children, normal, physiological eye growth may continue as the child ages. It supports the hypothesis that myopic axial elongation may be superimposed on underlying physiological axial elongation.

"Some eye growth as part of the aging process is normal--myopic axial elongation is not. Our work further validates that evidence-based interventions can be highly effective in slowing that myopic eye growth," said Paul Chamberlain, BSc (Hons), MCOptom, the paper's lead author and Director of Research Programs for CooperVision. "This offers hope to healthcare systems, currently bearing the growing burden of visual impairment linked to the increasing prevalence and severity of myopia."

MiSight® 1 day contact lenses designed for myopia control have been shown to reduce the rate of axial elongation in children (aged 8-12 at the initiation of treatment*) by 52% on average over a three-year period.†[4] As measured by spherical refraction, the lens reduced the rate of myopia progression in age-appropriate children* by 59% on average over the same period.†[4]

"In assessing treatment effectiveness, we caution our industry away from applying arbitrary correction factors to account for normal, physiological eye growth until this has been better understood. Efficacy percentages may seem higher presented in that light, but it complicates understanding and valid comparisons," said co-author Mark Bullimore, MCOptom, PhD, FAAO. "Eye care professionals must be able to rely on the ever-growing body of myopia literature to make evidence-based clinical decisions."

Credit: 
McDougall Communications

UCSF improves fetal heart defect detection using machine learning

UC San Francisco researchers have found a way to double doctors' accuracy in detecting the vast majority of complex fetal heart defects in utero - when interventions could either correct them or greatly improve a child's chance of survival - by combining routine ultrasound imaging with machine-learning computer tools.

The team, led by UCSF cardiologist Rima Arnaout, MD, trained a group of machine-learning models to mimic the tasks that clinicians follow in diagnosing complex congenital heart disease (CHD). Worldwide, humans detect as few as 30 to 50 percent of these conditions before birth. However, the combination of human-performed ultrasound and machine analysis allowed the researchers to detect 95% of CHD in their test dataset.

The findings appear in the May issue of Nature Medicine.

Fetal ultrasound screening is universally recommended during the second trimester of pregnancy in the United States and by the World Health Organization. Diagnosis of fetal heart defects, in particular, can improve newborn outcomes and enable further research on in utero therapies, the researchers said.

"Second-trimester screening is a rite of passage in pregnancy to tell if the fetus is a boy or girl, but it is also used to screen for birth defects," said Arnaout, a UCSF assistant professor and lead author of the paper. Typically, the imaging includes five cardiac views that could allow clinicians to diagnose up to 90 percent of congenital heart disease, but in practice, only about half of those cases are detected at non-expert centers.

"On the one hand, heart defects are the most common kind of birth defect, and it's very important to diagnose them before birth," Arnaout said. "On the other hand, they are still rare enough that detecting them is difficult even for trained clinicians, unless they are highly sub-specialized. And all too often, in clinics and hospitals worldwide, sensitivity and specificity can be quite low."

The UCSF team, which included fetal cardiologist and senior author Anita Moon-Grady, MD, trained the machine tools to mimic clinicians' work in three steps. First, they used neural networks to find five views of the heart that are important for diagnosis. Next, they used neural networks again to decide whether each of these views was normal or abnormal. Finally, a third algorithm combined the results of the first two steps to determine whether the fetal heart as a whole was normal or abnormal.
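The three-step pipeline can be sketched schematically. The generic view names, the placeholder detector/classifier functions, and the simple any-abnormal combination rule below are illustrative assumptions, not the UCSF models:

```python
# Schematic of the three-step screening pipeline described above.
# view_detector and view_classifiers stand in for the trained neural
# networks; they and the view names are placeholders, not UCSF's models.

CARDIAC_VIEWS = [f"view_{i}" for i in range(1, 6)]  # five diagnostic views

def screen_fetal_heart(ultrasound_frames, view_detector, view_classifiers):
    # Step 1: sort frames into the five diagnostically important views.
    views = {v: [f for f in ultrasound_frames if view_detector(f) == v]
             for v in CARDIAC_VIEWS}

    # Step 2: classify each recovered view as normal or abnormal.
    per_view = {v: any(view_classifiers[v](f) == "abnormal" for f in frames)
                for v, frames in views.items() if frames}

    # Step 3: combine per-view results into one screening call; here a
    # simple illustrative rule: any abnormal view flags the whole exam.
    return "abnormal" if any(per_view.values()) else "normal"
```

In the real system each step is a trained model rather than a rule, but the division of labor is the same: locate views, judge views, then fuse the judgments.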

"We hope this work will revolutionize screening for these birth defects," said Arnaout, a member of the UCSF Bakar Computational Health Sciences Institute, the UCSF Center for Intelligent Imaging, and a Chan Zuckerberg Biohub Intercampus Research Award Investigator. "Our goal is to help forge a path toward using machine learning to solve diagnostic challenges for the many diseases where ultrasound is used in screening and diagnosis."

Credit: 
University of California - San Francisco

UVA develops new tools to battle cancer, advance genomics research

image: UVA's Chongzhi Zang, PhD, and his colleagues and students have developed a new computational method to map the folding patterns of our chromosomes in three dimensions.

Image: 
Courtesy Zang lab at UVA

University of Virginia School of Medicine scientists have developed important new resources that will aid the battle against cancer and advance cutting-edge genomics research.

UVA's Chongzhi Zang, PhD, and his colleagues and students have developed a new computational method to map the folding patterns of our chromosomes in three dimensions from experimental data. This is important because the configuration of genetic material inside our chromosomes actually affects how our genes work. In cancer, that configuration can go wrong, so scientists want to understand the genome architecture of both healthy cells and cancerous ones. This will help them develop better ways to treat and prevent cancer, in addition to advancing many other areas of medical research.

Using their new approaches, Zang and his colleagues and students have already unearthed a treasure trove of useful data, and they are making their techniques and findings available to their fellow scientists. To advance cancer research, they've even built an interactive website that brings together their findings with vast amounts of data from other resources. They say their new website, bartcancer.org, can provide "unique insights" for cancer researchers.

"The folding pattern of the genome is highly dynamic; it changes frequently and differs from cell to cell. Our new method aims to link this dynamic pattern to the control of gene activities," said Zang, a computational biologist with UVA's Center for Public Health Genomics and UVA Cancer Center. "A better understanding of this link can help unravel the genetic cause of cancer and other diseases and can guide future drug development for precision medicine."

Bet on BART

Zang's new approach to mapping the folding of our genome is called BART3D. Essentially, it compares available three-dimensional configuration data about one region of a chromosome with many of its neighbors. It can then extrapolate from this comparison to fill in blanks in the blueprints of genetic material using "Binding Analysis for Regulation of Transcription", or BART, a novel algorithm they recently developed. The result is a map that offers unprecedented insights into how our genes interact with the "transcriptional regulators" that control their activity. Identifying these regulators helps scientists understand what turns particular genes on and off - information they can use in the battle against cancer and other diseases.
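The neighbor-comparison idea can be sketched in miniature. The function below scores how a genomic bin's 3D contacts with nearby bins differ between two Hi-C contact maps; it is a conceptual sketch of the comparison step, not the published BART3D implementation:

```python
# Conceptual sketch of BART3D's neighbor comparison (not the published
# implementation): for each genomic bin, compare its chromatin contacts
# with nearby bins between two Hi-C contact matrices to score how its
# 3D interactions changed between conditions.

def differential_contact_score(hic_a, hic_b, bin_idx, window=5):
    # Sum the contact differences (condition A minus condition B)
    # over neighboring bins within `window` of bin_idx.
    n = len(hic_a)
    lo, hi = max(0, bin_idx - window), min(n, bin_idx + window + 1)
    return sum(hic_a[bin_idx][j] - hic_b[bin_idx][j]
               for j in range(lo, hi) if j != bin_idx)
```

Bins with large scores are candidates whose regulatory interactions changed; in BART3D, such profiles are then fed to the BART algorithm to infer which transcriptional regulators are responsible.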

The researchers have built a web server, BARTweb, to offer the BART tool to their fellow scientists. It's available, for free, at http://bartweb.org. The source code is available at https://github.com/zanglab/bart2. Test runs demonstrated that the server outperformed several existing tools for identifying the transcriptional regulators that control particular sets of genes, the researchers report.

The UVA team also built the BART Cancer database to advance research into 15 different types of cancer, including breast, lung, colorectal and prostate cancer. Scientists can search the interactive database to see which regulators are more active and which are less active in each cancer.

"While a cancer researcher can browse our database to screen potential drug targets, any biomedical scientist can use our web server to analyze their own genetic data," Zang said. "We hope that the tools and resources we develop can benefit the whole biomedical research community by accelerating scientific discoveries and future therapeutic development."

Credit: 
University of Virginia Health System

Hundreds of antibiotic resistance genes found in the gastrointestinal tracts of Danish infants

Danish one-year-olds carry several hundred antibiotic resistance genes in their bacterial gut flora, according to a new study from the University of Copenhagen. The presence of these genes is partly attributable to antibiotic use among mothers during pregnancy.

An estimated 700,000 people die every year from antibiotic resistant bacterial infections and diseases. The WHO expects this figure to multiply greatly in coming decades. To study how antibiotic resistance occurs in humans' natural bacterial flora, researchers from the University of Copenhagen's Department of Biology analysed stool samples from 662 Danish one-year-old children.

Within the samples, the researchers discovered 409 different antibiotic resistance genes, which provide bacteria with resistance to 34 types of antibiotics. Furthermore, 167 of the 409 genes confer resistance to multiple types of antibiotics, including types classified as 'critically important' by the WHO because they will be needed to treat serious diseases in the future.

"It's a wake-up call that one-year-old children are already carrying gut bacteria that are resistant to very important types of antibiotics. New resistant bacteria are becoming more widespread due to increased antibiotic consumption. The horror scenario is that we will one day lack the antibiotics needed to treat life-threatening bacterial infections such as pneumonia or foodborne illnesses," explains Department of Biology professor Søren Sørensen, who led the study.

Antibiotic use during pregnancy is an important factor

The most important factor in whether an infant carried more antibiotic resistance genes in its gut bacteria was whether the child's mother had taken antibiotics during late pregnancy, or whether the infant had received antibiotics in the months before the stool samples were collected.

"We found a very strong correlation between a mother's antibiotic treatment during late pregnancy and her infant's gut bacteria carrying many resistance genes, although it appears that other influences come into play as well," says Xuan Ji Li of the Department of Biology, the study's lead author.

At the same time, the researchers found a link between how well developed a child's gut flora was and the concentration of resistant bacteria: well-developed gut flora was associated with a lower incidence of resistant bacteria. Previous studies of the same group of children demonstrated that the development of gut flora is linked to asthma risk later in life.

E. coli collects resistance genes

Escherichia coli (E. coli) is common in the intestine and can lead to intestinal infections. But in this study, the researchers also learned that E. coli appears to act as a main collector and a potential spreader of antibiotic-resistant genes to other gut bacteria.

The researchers also found E. coli in infants with high concentrations of resistance genes in their intestinal tracts.

"The new findings have expanded our understanding of antibiotic resistance by showing us which bacteria act as collectors and potential spreaders of resistance genes. While we know that resistance is transferred among bacteria, we also now know that E. coli is one of the ones we need to keep a particularly close eye on," says Xuan Ji Li of the Department of Biology. Søren Sørensen adds:

"The new knowledge brought about by this study may prove useful in the effort to better manage antibiotic treatments among pregnant women and serve as a basis for more targeted methods of eliminating the types of bacteria which collect resistance genes."

Credit: 
University of Copenhagen - Faculty of Science

A 1% Hubble parameter estimation from the LISA-Taiji gravitational wave observatory network

image: Averaged event numbers for 1-year, 3-year and 5-year observation times.

Image: 
©Science China Press

The Hubble parameter is one of the central parameters in modern cosmology. Its values inferred from late-time observations are systematically higher than those from early-time measurements by about 10%, a discrepancy known as the "Hubble tension". To reach a robust conclusion, independent probes with percent-level accuracy are crucial. Because they are self-calibrated by the theory of general relativity, gravitational waves from compact binary coalescences open a completely new observational window for determining the Hubble parameter, and can thus shed light on the Hubble tension. Depending on whether they are associated with electromagnetic counterparts, gravitational wave events are categorized as bright sirens or dark sirens. Future space-borne gravitational wave observatory networks, such as the LISA-Taiji network, will be able to measure gravitational wave signals in the millihertz band with unprecedented accuracy, an advantage for measuring the Hubble constant.

A group of Chinese scientists from Beijing and Shenzhen forecast how well the Hubble parameter can be constrained using gravitational wave siren data from future space-borne observatories such as LISA (an ESA/NASA space mission) and Taiji (a Chinese space project). The signals are generated by the inspirals and mergers of massive black hole binaries. Astronomers believe each galaxy hosts a central massive black hole whose mass is about one-thousandth of its bulge mass, though there are outliers: galaxies that have just experienced a violent collision with their neighbors. Over the long history of the universe, such galaxy mergers happened frequently and triggered star formation. In addition, the central massive black holes of the merging galaxies eventually merge with each other, emitting significant gravitational wave radiation in the final phase of coalescence. There is a time delay between the galaxy merger and the massive black hole merger: on cosmic time scales it is an instant, but in human terms it is on the order of a million years, so some galaxies will host two or three massive black holes. Once the separation of these black holes falls below about 0.001 parsec, gravitational wave emission dominates the energy loss and drives the hardening of the binary. Thanks to the excellent sensitivity of space gravitational wave observatories, this tiny signal will be measurable, and, more importantly, it carries rich cosmological information.
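The standard-siren logic can be illustrated with a toy calculation: the gravitational waveform gives an absolute luminosity distance, and a redshift then fixes the Hubble parameter. The numbers and the low-redshift approximation below are illustrative assumptions only, not the paper's analysis, which uses high-redshift events and a full cosmological model:

```python
# Toy standard-siren sketch (not the paper's analysis): at low redshift
# the Hubble law d_L ≈ c*z/H0 lets a single GW distance/redshift pair
# estimate H0. Real dark-siren analyses combine many events and
# marginalize over galaxy-catalog redshifts.

C_KM_S = 299_792.458  # speed of light in km/s

def hubble_from_siren(luminosity_distance_mpc, redshift):
    # Invert d_L = c*z/H0 (valid only for z << 1).
    return C_KM_S * redshift / luminosity_distance_mpc

# Hypothetical event: d_L = 43.8 Mpc at z = 0.01
h0 = hubble_from_siren(43.8, 0.01)  # ~68 km/s/Mpc
```

The "self-calibration" mentioned above is what makes this possible: general relativity fixes the intrinsic waveform amplitude, so the distance needs no external distance ladder.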

By including several statistical and instrumental noise sources, the Chinese team shows that "within 5 years operation time, the LISA-Taiji network is able to constrain the Hubble parameter within 1% accuracy, and possibly beats the scatters down to 0.5% or even better". They also calculate the average event numbers for different massive black hole formation models and several observation times. After five years of network observation, for the optimistic heavy-seed model, the average number of events with Hubble parameter accuracy better than 1% could reach 0.9, with the upper bound of its 95% confidence interval crossing unity. "We will very probably capture one gold or diamond event after 5-year network observation," the scientists forecast.

See the article:
Renjie Wang, Wen-Hong Ruan, Qing Yang, Zong-Kuan Guo, Rong-Gen Cai, Bin Hu
Hubble parameter estimation via dark sirens with the LISA-Taiji network
Natl Sci Rev
https://doi.org/10.1093/nsr/nwab054

Credit: 
Science China Press

Coronavirus testing made quick and easy

image: KAUST scientists have combined state-of-the-art bioelectronic hardware, materials science engineering and synthetic biology protein design to simplify and accelerate coronavirus testing.

Image: 
© 2021 KAUST; Xavier Pita

A new rapid coronavirus test developed by KAUST scientists can deliver highly accurate results in less than 15 minutes.

The diagnostic, which brings together electrochemical biosensors with engineered protein constructs, allows clinicians to quickly detect bits of the virus with a precision previously only possible with slower genetic techniques. The entire set-up can work at the point of patient care on unprocessed blood or saliva samples; no laborious sample preparation or centralized diagnostic laboratory is required.

"The combination of state-of-the-art bioelectronic hardware, materials science engineering and synthetic biology protein design really makes it possible to simplify and accelerate coronavirus testing," says Raik Grünberg, a biochemist at KAUST who co-led the study.

Grünberg and his KAUST colleagues, including Sahika Inal and Stefan Arold, are now working with commercial partners to adapt their lab-scale prototype. They hope to create a bench-top, portable device that can be deployed to help contain the COVID-19 pandemic.

"This biosensor technology could be adapted to detect other pathogens and, as such, will have a major impact on controlling pandemics -- today and in the future," Inal says.

Coronavirus testing remains as important as ever. Despite increasing rates of vaccination in Saudi Arabia and around the world, global case numbers of COVID-19 remain at worryingly high levels, and public health officials need ways of rapidly identifying individuals who have contracted the disease so that they can limit virus transmission.

Current testing paradigms generally fall into two camps: either they detect viral RNA through genetic means, which can be slow and involves enzymatic amplification of trace molecular signals, or they capture viral proteins (known as antigens) in ways that are fast but not nearly as accurate.

The new KAUST technique now combines the speed of protein detection with the precision of genetic tests. "Even if there's only a single virus particle in a sample, our platform will detect it," says Keying Guo, a postdoc in Inal's lab and co-author of the study with his KAUST colleagues Shofarul Wustoni and Anil Koklu.

The system starts with a virus-specific nanobody, a type of binding protein that can be designed to stick to fragments of different coronaviruses, including those responsible for COVID-19 and Middle East respiratory syndrome (MERS). The nanobody is tethered through a series of biochemical linkers to a thin layer of gold that, when an electric current is applied, controls the flow of electricity through the semiconducting film it is connected to. The presence of any nanobody-bound viral proteins changes that flow, creating a signal that is amplified to measurable levels by a device known as an organic electrochemical transistor.

The researchers initially honed their test on human saliva and blood samples spiked with bits of protein from the coronaviruses that cause MERS and COVID-19. They then collaborated with doctors from KAUST Health in Thuwal and with scientists from King Faisal Specialist Hospital and Research Center in Riyadh to validate the assay on clinical specimens, both saliva and nasal swabs, collected from patients.

The speed, versatility and performance compared to standard genetic testing highlight the potential for the new method to complement or possibly replace existing diagnostics for COVID-19 and any future pandemics.

Credit: 
King Abdullah University of Science & Technology (KAUST)

Small modular reactors competitive in Washington's clean energy future

RICHLAND, Wash.--As the Clean Energy Transformation Act drives Washington state toward carbon-free electricity, a new energy landscape is taking shape. Alongside renewable energy sources, a new report finds small modular reactors are poised to play an integral role in the state's emerging clean energy future.

The technology could help fill a power source gap soon to be left by carbon-emitting resources like coal and natural gas, which will be phased out in coming years, according to a report composed by researchers at the U.S. Department of Energy's Pacific Northwest National Laboratory and Massachusetts Institute of Technology.

"Nuclear energy is a reliable source of baseload electricity," said PNNL's manager for nuclear power systems Ali Zbib, who coauthored the report, "and our findings show that advanced small modular reactors could be economically competitive in a future carbon-free electricity sector. They're well-suited to play an important role in an energy market that requires more flexibility."

The report detailed how the new reactors could satisfy the Pacific Northwest's dynamic electricity demand, assessed the region's electricity market and explored the viability of deploying the reactors at three locations: at the Hanford Site, utilizing infrastructure from Energy Northwest's partially completed power plants; at the coal-fired Centralia Big Hanaford power plant in Western Washington, which is slated for closure; and at the Idaho National Laboratory site, where Utah Associated Municipal Power Systems plans to build the first NuScale light-water reactor.

The findings outline advantages of building at each site, from a trained workforce residing in Eastern Washington to infrastructure in place at the Centralia plant. The report focused on two designs: NuScale's small modular reactor and GE Hitachi Nuclear Energy's BWRX-300.

"Nuclear energy can help to decarbonize every sector of the economy," said Jacopo Buongiorno, TEPCO professor of nuclear science and engineering at the Massachusetts Institute of Technology and co-author of the report. "The state of Washington is uniquely positioned to kick off this process by demonstrating the leading small modular reactor technologies well before the end of the decade, and that's exciting."

A changing energy landscape

The Clean Energy Transformation Act was enacted in 2019, mandating that Washington's power sources must generate electricity without emitting greenhouse gases by 2045. The law seeks carbon neutrality by 2030, with coal eliminated by 2025 and penalties for carbon-emitting sources imposed as soon as 2030.

Phasing out coal and natural gas, which supplied roughly 17 percent of Washington's overall fuel mix in 2018, will reduce power generation capacity by roughly five gigawatts--nearly equivalent to the output of four large nuclear power plants.
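The comparison above is easy to sanity-check, assuming a typical large nuclear plant produces roughly 1.25 gigawatts (an assumed round figure, not a number from the report):

```python
# Sanity check of the figures above. The per-plant output is an
# assumption (a typical large reactor is on the order of 1-1.3 GW),
# not a value taken from the PNNL/MIT report.

capacity_gap_gw = 5.0    # coal + natural gas capacity to be phased out
large_plant_gw = 1.25    # assumed output of one large nuclear plant

equivalent_plants = capacity_gap_gw / large_plant_gw  # ~4 plants' worth
```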

More power will likely be needed in the future, said Zbib, as the state continues to electrify different sectors of its economy, including the transportation sector, and as the region's population grows. Renewables in the form of wind and solar stand to fill some of the load. A complementary source of flexible power will be needed, however, to meet demand.

Small but significant

Enter small modular reactors, which are designed for flexible electricity generation. The designs discussed within the report employ water as a coolant, while other advanced designs employ gas or liquid metal coolants. The emissions-free technology has yet to be deployed in the United States, but designs have matured, including the two analyzed in this report.

As part of a larger investment spread over several years, the U.S. Department of Energy announced last year that two companies--X-energy and TerraPower--will each receive $80 million in initial funding to build advanced reactor demonstration plants that feature small modular reactors.

The size and modularity of the reactors offer unique advantages, according to the report. Smaller cores, less radiological material and layered safety barriers built into small modular reactors result in evacuation zones miles smaller than those of conventional reactors.

Components of small modular reactors can be manufactured and assembled off-site, too, then shipped and installed on-site, cutting construction costs and trimming project timelines, said Zbib.

GEH's design allows for multiple plants within a single site. NuScale's design is scalable, in that modules can be added incrementally--up to a dozen per site. A four-module plant, capable of powering nearly 200,000 homes, could meet demand for small regions, while a twelve-module plant could power a large city.

The need for flexibility

Advanced small modular reactors are flexible--they can continuously operate at full power to provide baseload energy or can follow power swings on the grid. They could play a key role within a clean electricity generation system that includes nuclear, renewables and battery storage.

The report's energy market assessment found that Washington's electricity demand can significantly fluctuate on a monthly, daily and even five-minute basis. The average daily demand in February 2019, for example, varied by more than 2,100 megawatts, roughly equal to twice the output of Energy Northwest's Columbia Generating Station, the only commercial nuclear power plant in the Pacific Northwest.

Swings in electricity generated by some sources can likewise drive the need for a power source that adjusts quickly. The report details one 2009 case in which 700 megawatts of power surged onto the Bonneville Power Administration grid during just five minutes of strong winds. Utilities must ramp down output from flexible sources during such surges.

Taking one or more power modules offline, in the case of the NuScale power plant design, or adjusting reactor power could allow operators to quickly adjust to swings.

Exploring locations

Building a plant at the Hanford Site, according to the report, could help cut costs in several ways. Much of the licensing and assessments that determine whether an area is fit for a nuclear power plant are already completed. Seismic risks and analyses of other vulnerabilities were carried out when the partially complete plants were built in the 1980s.

Nearly $140 million could be saved by using standing structures, according to a study described within the report, and the project schedule could be shortened, resulting in additional cost savings. And with Columbia Generating Station nearby, a trained workforce familiar with nuclear reactors offers another advantage.

The report finds that building at the Centralia plant, which will close all coal-fired boilers by 2025, has its own merits. Existing infrastructure can be harnessed, the facility is already connected to the grid and a new plant could replace jobs lost from the closure.

Building on the west side of the Cascade Mountains could reduce both transmission congestion and the need for transmission infrastructure to supply power to nearby high-demand areas like Seattle, Tacoma and Olympia.

With subsidies for clean energy and penalties for carbon-emitting resources in place, the report contends that plants at either location could provide competitively priced electricity.

"Decarbonizing the electric grid is essential in combating climate change," said Zbib. While wind and solar will play a critical role, he added, phasing out carbon-emitting resources sparks the need for flexible, non-carbon-emitting sources. "Nuclear energy can be an integral part of a clean energy portfolio that will allow the state of Washington to meet its clean energy objectives."

Credit: 
DOE/Pacific Northwest National Laboratory

Hidden genes discovered in bovine genome

Modern genetic research often works with what are known as reference genomes. Such a genome comprises data from DNA sequences that scientists have assembled as a representative example of the genetic makeup of a species.

To create the reference genome, researchers generally use DNA sequences from one or a few individuals, which may poorly represent the full genomic diversity of the species or its sub-populations. As a result, a reference does not always correspond exactly to the set of genes of a specific individual.

Until a few years ago, it was very laborious, expensive and time-consuming to generate such reference genomes. For this reason, researchers concentrated on human genomes and the most important biological model organisms, such as the roundworm C. elegans.

However, as researchers now have access to fast sequencing machines, sophisticated algorithms that assemble DNA sequence readouts into complete chromosomes, and much greater computing power, creating reference genomes for other species has become increasingly practical. If researchers are to better understand evolution and other fundamental questions of biology, they need high-quality reference genomes for as many species as possible.

This includes livestock. For domestic cattle (Bos taurus), only a single reference genome was available until recently: from a Hereford cow called Dominette. Researchers had previously compared other DNA sequences of cattle against this reference to detect genetic variations and define corresponding genotypes. However, as it did not contain any genetic variants by which individuals differ, the previous reference did not reflect the diversity of the species.

Gap filled

A research team led by Hubert Pausch, Assistant Professor of Animal Genomics at ETH Zurich, has now filled this gap. Using the genomes of three further breeds of domestic cattle, including the Brown Swiss (Original Schweizer Braunvieh), two closely related (sub-)species, the zebu and the yak, and the existing reference genome for domestic cattle, the researchers have created a "pangenome". The study detailing these findings has just been published in the scientific journal PNAS.

This cattle pangenome integrates sequences contained in the six individual reference genomes. "This means we can reveal very precisely which sequences are missing, for example, in the Hereford-based reference genome, but are present in, say, our Brown Swiss genome or the genomes of other cattle breeds and species," Pausch says.
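The pangenome idea can be sketched as simple set operations: each assembly contributes a set of sequence segments, the pangenome is their union, and the segments absent from any one assembly fall out by set difference. The breed names below appear in the study, but the segment labels are invented purely for illustration.

```python
# Toy sketch of the pangenome idea: each assembly is modelled as a set of
# sequence segments (invented labels); the pangenome is their union, and
# segments missing from the reference are the union minus the reference set.
assemblies = {
    "Hereford (reference)": {"seg1", "seg2", "seg3"},
    "Brown Swiss":          {"seg1", "seg2", "seg4"},
    "Yak":                  {"seg1", "seg3", "seg5"},
}

pangenome = set().union(*assemblies.values())
missing_from_reference = pangenome - assemblies["Hereford (reference)"]

print(sorted(missing_from_reference))  # ['seg4', 'seg5']
```

Real pangenome tools operate on aligned sequence graphs rather than labelled segments, but the underlying set logic is the same.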

New genes and functionalities discovered

In this way, the ETH researchers discovered numerous DNA sequences and even whole genes that were missing from the previous reference genome of the Hereford cow. In a further step, the researchers investigated the transcripts of these genes (messenger RNA molecules), which allowed them to classify some of the newly discovered sequences as functionally and biologically relevant. Many of the genes they discovered are connected with immune functions: in animals that had contact with pathogenic bacteria, these genes were more or less active than in animals that had no contact with the pathogens.

This project was made possible by a new sequencing technology that has been available at the Functional Genomics Center Zurich for a year now. With this new technology, the researchers are able to precisely read out long DNA sections, reducing the complexity of the computing process needed to correctly assemble the analysed sections. "The new technology simplifies the genome assembly process. Now we can create reference genomes quickly and precisely from scratch," Pausch says. In addition, such analyses also cost less, meaning that researchers can now generate genomes in reference quality from many individuals of a species.

The ETH researchers are collaborating closely with the Bovine Pangenome Consortium, which wants to create a reference genome of at least one animal from every cattle breed worldwide. It also plans to analyse the genetic makeup of wild relatives of domestic cattle in this way.

More targeted breeding possible

The consortium and ETH professor Pausch hope that the reference genome collection will help them make useful discoveries such as genetic variants that are no longer present in domesticated animals, but that their wild relatives still possess. This would provide clues as to which genetic characteristics were lost as a result of domestication.

"Things get really exciting when we compare our indigenous cattle with the zebu or with breeds that are adapted to other climate conditions," Pausch explains. This lets researchers find out which genetic variants make animals in tropical environments more heat tolerant. The next step could be to deliberately use crossbreeding to introduce these variants into other cattle breeds or precisely introduce them through genome editing. However, that is still a long way off. For the present, researchers can benefit from the greater speed and precision that the new cattle pangenome brings to the process of detecting the genes and DNA variants that differ between cattle breeds.

Credit: 
ETH Zurich

Protein tenascin-C important in retinal blood flow disorders

image: Jacqueline Reinhard is a researcher at the Department of Cell Morphology and Molecular Neurobiology.

Image: 
RUB, Marquard

Many eye diseases are associated with a restricted blood supply, known as ischaemia, which can lead to blindness. The role of the protein tenascin-C, an extracellular matrix component, in retinal ischaemia was investigated in mice by researchers from Ruhr-Universität Bochum (RUB). They showed that tenascin-C plays a crucial role in damaging the cells responsible for vision following ischaemia. The results were published online by the team in the journal Frontiers in Neuroscience on 20 May 2021.

As part of the research, the team around Dr. Susanne Wiemann and Dr. Jacqueline Reinhard from the Department of Cell Morphology and Molecular Neurobiology at RUB collaborated with Professor Stephanie Joachim's research group from the Experimental Eye Research Institute at the University Eye Clinic in Bochum.

Tenascin-C after retinal ischaemia

Ischaemia occurs due to an interruption in the supply of blood and nutrients to the retina - similar to a stroke. This causes the cells responsible for vision to die, which can lead to impaired vision or even blindness. The research team showed in mice that retinal cells express increased levels of tenascin-C at a very early stage following ischaemia. The quantity of the protein then gradually reduces again as the damage to the retina progresses. "Tenascin-C could therefore be a biomarker for the early detection of ischaemic eye conditions," says Jacqueline Reinhard.

Improved retinal function in mice without tenascin-C following ischaemia

The researchers also conducted electroretinogram analyses, allowing them to measure the electrical signal flow of the retina after a light stimulus. They thus showed that retinal ischaemia impairs the function of certain cell types in the retina: both the rod-photoreceptors and the bipolar cells, which are involved in downstream visual processing.

In genetically modified mice that were unable to form tenascin-C, the cells responsible for vision in the retina functioned considerably better following ischaemic damage than those in control animals that had tenascin-C. In addition, fewer photoreceptors died without tenascin-C after ischaemia.

Possible changes at the contact points between neurons

The researchers also demonstrated elevated levels of the vesicular glutamate transporter vGlut1 in the ischaemic retina. "These could be linked to impaired synaptic signal transmission between the cells and contribute to cell death as a result of retinal ischaemia. Tenascin-C could be an important modulator here," assumes Jacqueline Reinhard. "Based on this knowledge, future therapy approaches could be developed to improve the treatment of ischaemia."

Credit: 
Ruhr-University Bochum

Research uncovers how 'non-professional' cells can trigger immune response

image: Worms with alterations in purine metabolism are shown triggering an immune response without pathogen infection. The immune response is revealed through green fluorescence in mutant worm intestines (middle and right) and not in control worms (left).

Image: 
Troemel Lab, UC San Diego

Amid the vast fallout of the COVID-19 pandemic, scientists are paying closer attention to microbial infections and how life forms defend themselves against attacks from pathogens.

Research led by University of California San Diego scientists has shed new light on the complex dynamics involved in how organisms sense that an infection is taking place.

UC San Diego Assistant Project Scientist Eillen Tecle in Professor Emily Troemel's laboratory (Division of Biological Sciences) led research focusing on how cells that are not part of the conventional immune system respond to infections when pathogens attack. Scientists have conducted extensive research on so-called "professional" immune cells that are defensive specialists. Much less is known about how "non-professional" cells handle such threats.

Tecle, Troemel and their colleagues at Pennsylvania State University focused their research on roundworms (Caenorhabditis elegans), animals that lack dedicated immune cells, to help decipher details of such dynamics.

As described in the journal PLOS Pathogens, the researchers conducted experiments involving roundworms under attack by viruses and microsporidia, which are natural pathogens of worms and humans. The results indicate that roundworms may sense changes in their metabolism in order to unleash protective defenses, even if they don't directly sense the pathogen incursion.

In their study, the researchers examined how hosts may respond when pathogens such as viruses and microsporidia steal key compounds, known as nucleotides, from C. elegans cells. Pathogens like these must pilfer such components from their hosts in order to survive. The study's results focused on biological pathways related to the breakdown of chemical compounds known as purine nucleotides. This purine metabolism pathway is key to the cells' ability to sense alterations as a way to induce an immune response.

"We hypothesize that the host has ways to surveil what's going on inside of its cells in an active process," said Tecle. "Our results suggest that the host has developed ways to sense the theft of purine metabolites. It seems that when these key cellular building blocks are stolen by the pathogen, the host senses this theft to mount an immune response to the pathogen."

This research may shed light on why purine-related compound mutations have been found to underlie many human diseases, including adenosine deaminase deficiency, which damages the immune system, and Lesch-Nyhan syndrome, which involves neurological and behavioral abnormalities. While these mutations result in various disorders in humans, they may persist in the human population to provide some protection against infections, for example during viral pandemics.

"Particularly in the context of the COVID-19 pandemic, it's so important that we continue to study these questions of immunity in lots of different systems to build new tools so that we can learn how to prevent and treat infections," said Troemel.

Credit: 
University of California - San Diego

Deciphering the structure of a toxic substance that destroys nerves in the brain

image: Schematic diagram of quadruple force mapping of hetero-oligomers derived from amyloid-beta and alpha-synuclein. Hetero-oligomers were characterized by the four types of AFM probes tethering an antibody recognizing each end of peptides.

Image: 
POSTECH

Alzheimer's disease, the most common form of dementia, in which memory and cognitive functions gradually decline due to the deformation and death of neurons, and Parkinson's disease, which causes tremors in the hands and arms that impede normal movement, are major neurodegenerative diseases. Recently, a research team at POSTECH has identified the structure of the agent that causes Alzheimer's and Parkinson's diseases to occur together.

A research team led by Professor Joon Won Park and Ph.D. candidate Eun Ji Shin of the Department of Chemistry at POSTECH investigated the surface structure of hetero-oligomers found in the overlap of Alzheimer's disease and Parkinson's disease, using atomic force microscopy (AFM) to reveal their structural identity. This study was featured as the front cover paper in the latest issue of Nano Letters.

It is known that the pathological overlap of Alzheimer's disease and Parkinson's disease is associated with the formation of hetero-oligomers derived from amyloid-beta and alpha-synuclein. However, technical limitations in observing their structure have made it difficult to study potential treatments.

To this end, the researchers used AFM to observe, at the single-molecule level, the surface characteristics of hetero-oligomer nano-aggregates derived from amyloid-beta, known as the biomarker of Alzheimer's disease, and alpha-synuclein, known as the biomarker of Parkinson's disease.

Probing the aggregates with four AFM tips, each functionalized with an antibody recognizing the N-terminus or C-terminus of one of the two peptides, the team confirmed that all the aggregates were hetero-oligomers. They also found that the probability of recognizing a peptide terminus was higher for hetero-oligomers than for homo-oligomers.

This result indicates that the ends of each peptide are more likely to be located on the surface of hetero-oligomers than of homo-oligomers, or that the peptide ends located on the surface have more degrees of freedom. In other words, the peptides are more loosely packed in hetero-oligomers than in homo-oligomers.

This is the first study to observe the structure of disordered protein nano-aggregates using quadruple mapping with four AFM tips. It provides experimental grounds for verifying the hypothesis of hetero-oligomer aggregation, and the approach can also be applied to studies of overlapping phenomena in neurodegenerative diseases other than Alzheimer's and Parkinson's.

"Until now, there was no adequate method to analyze the nano-aggregates, making it impossible to elucidate the structural identity of heterogeneous aggregates," explained Professor Joon Won Park. "As the analysis method developed in this study is applicable to other amyloid protein aggregates, it will help to identify the cause of diseases such as Alzheimer's or the mad cow disease."

Credit: 
Pohang University of Science & Technology (POSTECH)

Hacking and loss of driving skills are major consumer concerns for self-driving cars

A new study from the University of Kent, Toulouse Business School, ESSCA School of Management (Paris) and ESADE Business School (Spain) has revealed the three primary risks and benefits perceived by consumers towards autonomous vehicles (self-driving cars).

The increased development of autonomous vehicles worldwide inspired the researchers to uncover how consumers feel towards the growing market, particularly in areas that dissuade them from purchasing, to understand the challenges of marketing the product. The following perceptions, gained through qualitative interviews and quantitative surveys, are key to consumer decision making around autonomous vehicles.

The three key perceived risks for autonomous vehicles, according to surveyed consumers, can be classified as:

1. Performance (safety) risks of the vehicles' Artificial Intelligence and sensor systems

2. Loss of competencies by the driving public (primarily the ability to drive and use roads)

3. Privacy security breaches, similar to a personal computer or online account being hacked.

These concerns, particularly regarding road and passenger safety, have long shaped how automotive companies market their products. Marketers have advertised continued improvements to the technology in a bid to ease safety concerns. However, loss of driving skills and privacy breaches remain major concerns and will need addressing as these products become more widespread.

The three perceived benefits to consumers were:

1. Freeing of time (time otherwise spent driving)

2. Removing the issue of human error (accidents caused by human drivers)

3. Outperforming human capacity, such as improved route and traffic prediction and handling speed.

Ben Lowe, Professor of Marketing at the University of Kent and co-author of the study said: 'The results of this study illustrate the perceived benefits of autonomous vehicles for consumers and how marketers can appeal to consumers in this growing market. However, we will now see how the manufacturers respond to concerns of these key perceived risks as they are major factors in the decision making of consumers, with the safety of the vehicles' performance the greatest priority. Our methods used in this study will help clarify for manufacturers and marketers that, second to the issue of online account security, they will now have to address concerns that their product is reducing the autonomy of the consumer.'

Credit: 
University of Kent

Raised buildings may help reduce malaria transmission in Africa

There is growing evidence that house design can decrease the force of malaria infection.

The world's most deadly assassin is Africa's malaria mosquito: Anopheles gambiae. In 2019, the World Health Organisation estimated that malaria killed 386,000 people in sub-Saharan Africa, mainly children.

Whilst we think of the home as a sanctuary, in Africa around 80% of malaria bites occur indoors at night. Preventing mosquitoes from getting indoors is a simple way of protecting people from this often lethal disease.

As most mosquitoes fly low to the ground, a team of researchers led by Durham University wondered whether raising a house would make it harder for malaria mosquitoes to find the occupants.

The findings are published in the Journal of the Royal Society Interface.

Using four experimental houses, the researchers found that the number of female An. gambiae mosquitoes collected in the huts declined progressively as the hut's floor was raised further from the ground.

Huts with floors 3 metres above the ground had 84% fewer mosquitoes than those on the ground. Interestingly, if this reduction translates into a similar reduction in malaria transmission, it would be comparable to that of an insecticide-treated net, which can reduce malaria transmission by 40-90%.

Research lead author Professor Steve Lindsay, from the Durham University Department of Biosciences, said: "Working with a team of architects and builders from the Royal Danish Academy - Architecture, Design and Conservation, we constructed four experimental houses in The Gambia, each of which could be raised or lowered. Each week, one hut was on the ground, whilst the bottoms of the other huts were at 1m, 2m and 3m.

"Each night two men slept under separate mosquito nets in each hut and mosquitoes were collected indoors using a light trap. We changed the height of each house weekly so that, at the end of the 40 night experiment, each hut had been at each of the four heights for 10 nights.

"After analysing the results, we found that increasing the height of a hut progressively reduced the number of mosquitoes entering the hut and we think there are two reasons for this.

"First, malaria mosquitoes have evolved to find humans on the ground. Second, at higher heights, the carbon dioxide odour plumes coming out of the huts are rapidly dispersed by the wind, so mosquitoes find it more difficult to find a person to bite.

Study lead, Durham University PhD student Ms Majo Carrasco-Tenezaca, said: "These findings have real-world implications for the growing population of sub-Saharan Africa, where An. gambiae s.l. is the major vector of malaria and where high temperatures reduce the use of bed nets.

"Raising houses off the ground, like any intervention, is not evolutionary proof, and over time, mosquitoes may adapt and feed higher off the ground than before.

"Nonetheless, we recommend elevating houses off the ground since they are likely to reduce mosquito biting and keep the occupants cooler at night, and therefore more likely to sleep under an insecticide-treated net at night."

The United Nations has projected that the population of sub-Saharan Africa will more than double between 2019 and 2050, and the region will become the world's most populated by 2062.

Coincident with the increasing growth rate, there has been an unprecedented improvement in the housing stock in sub-Saharan Africa. With an additional 1.05 billion people by 2050, there has never been a better time to make houses healthier for people.

Credit: 
Durham University

Aquaculture turns biodiversity into uniformity along the coast of China

image: Fieldwork along the Chinese coast

Image: 
He-Bo Peng (NIOZ/RUG)

Fisheries and aquaculture have given rise to an enormous uniformity in the diversity of bivalves along the more than 18,000-kilometre-long Chinese coast, biologist He-Bo Peng and colleagues report in this month's issue of Diversity and Distributions.

Climate zones

Peng and colleagues sampled bivalves at 21 sites along the Chinese coast, from the city of Dongliaodao in the tropical south to the mudflats of Yalu Jiang, more than 2,000 km further north and ice-covered for several months in winter. "At 19 out of these 21 sites, commercially exploited species dominated", Peng observed. "Among the naturally occurring species, we still recognized the natural gradient, with the highest diversity in the tropics and the lowest in the north. However, the same commercially exploited species were found in all regions, regardless of the climatic conditions. These commercial species also dominated almost all mudflat communities."

Shrimp fisheries

One of the dominant species nowadays is Potamocorbula laevis. Peng: "At some locations, this bivalve represented 95% of the biomass." This is not necessarily due to the culturing of this bivalve. "The intensive fishing for shrimps has almost eradicated a natural enemy of the small fragile juveniles of this bivalve. Possibly, this lack of predators has now favored P. laevis."
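Dominance figures like the 95% quoted here are simply the biomass share of the most abundant species at a site; a toy calculation, with invented numbers chosen to match the reported share, makes the definition concrete.

```python
# Toy illustration of species dominance at a sampling site: the share of
# total biomass contributed by the most abundant species. The numbers are
# invented, chosen so the dominant species makes up 95% as reported.
site_biomass_g = {"Potamocorbula laevis": 950.0, "other bivalves": 50.0}

total = sum(site_biomass_g.values())
dominance = max(site_biomass_g.values()) / total
print(f"Dominant species share: {dominance:.0%}")  # 95%
```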

Blessing in disguise

The eradication of the south to north biodiversity gradient and the dominance of commercially exploited bivalve species may be a blessing in disguise for migratory birds along the Chinese coast. Peng: "At sites where there's a lot of P. laevis now, migratory birds like red and great knots find a lot of food as they refuel during their migrations between Australia and the tundra areas of northeast Russia. But the more general point raised by this study is the immensity of the role of human activities in determining coastline biodiversity. Where once there was a strong climate gradient in biodiversity, aquaculture and fisheries have now favored uniformity."

Continuous sampling

"This extensive research along more than 18,000 km of Chinese coast builds on the Dutch tradition of studying mudflat ecosystems", Theunis Piersma, researcher at NIOZ and promotor of PhD-student Peng as professor of global flyway ecology at the University of Groningen, says. "Sampling programs like SIBES in the Dutch Wadden Sea taught us the value of widespread and continuous sampling. In this study He-Bo Peng once again shows the value of documenting benthos at very large spatial scales, with sampling strategies developed in the Wadden Sea."

Credit: 
Royal Netherlands Institute for Sea Research